Segmentation and Tracking for Vision Based Human Robot Interaction

  • Authors:
  • Salman Valibeik; Guang-Zhong Yang

  • Venue:
  • WI-IAT '08 Proceedings of the 2008 IEEE/WIC/ACM International Conference on Web Intelligence and Intelligent Agent Technology - Volume 03
  • Year:
  • 2008

Abstract

Vision-based Human Robot Interaction (HRI) in crowded scenes is a challenging research problem. The aim of this paper is to provide a reliable framework for recognising simple gestures for robotic navigation under partial occlusion and varying illumination conditions. The proposed method combines hand-motion segmentation with skin-colour detection for gesture recognition: motion clustering based on the Least Median of Squares (LMedS) estimator is followed by Kalman filtering and HMM-based gesture detection. Experimental results show that the method successfully restores the motion field, allowing accurate dominant affine-motion detection for consistent gesture estimation.
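The paper itself gives no implementation details, but the LMedS step it names has a standard form: repeatedly fit an affine motion model to minimal samples of point correspondences and keep the model whose median squared residual is smallest, so that outlier motion vectors (e.g. background or occluding motion) do not bias the dominant-motion estimate. The sketch below is an illustrative NumPy implementation of that generic LMedS affine-fitting idea, not the authors' actual code; the function names and parameters are assumptions.

```python
import numpy as np

def fit_affine(src, dst):
    """Solve dst = A @ src + t from >= 3 correspondences (least squares)."""
    n = src.shape[0]
    M = np.zeros((2 * n, 6))
    M[0::2, 0:2] = src   # rows for the x-equation of each point
    M[0::2, 2] = 1.0
    M[1::2, 3:5] = src   # rows for the y-equation of each point
    M[1::2, 5] = 1.0
    p, *_ = np.linalg.lstsq(M, dst.reshape(-1), rcond=None)
    A = np.array([[p[0], p[1]], [p[3], p[4]]])
    t = np.array([p[2], p[5]])
    return A, t

def lmeds_affine(src, dst, trials=300, seed=None):
    """Least Median of Squares: sample minimal 3-point sets and keep the
    affine model with the smallest median squared residual over all points.
    Degenerate (e.g. collinear) samples simply yield models with a large
    median residual and are discarded by the comparison."""
    rng = np.random.default_rng(seed)
    n = src.shape[0]
    best_model, best_med = None, np.inf
    for _ in range(trials):
        idx = rng.choice(n, 3, replace=False)   # minimal sample for affine
        A, t = fit_affine(src[idx], dst[idx])
        residuals = np.sum((src @ A.T + t - dst) ** 2, axis=1)
        med = np.median(residuals)
        if med < best_med:
            best_med, best_model = med, (A, t)
    return best_model, best_med
```

Because the score is a median rather than a sum, the estimate tolerates up to roughly half of the correspondences being outliers, which is what makes this estimator attractive for segmenting hand motion from cluttered-scene motion.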