Motion analysis for human-robot interaction

  • Authors:
  • Kye Kyung Kim; Hae Jin Kim; Jae Yeon Lee

  • Affiliations:
  • Electronics Telecommunications Research Institute, Korea (all authors)

  • Venue:
  • MMACTE'05 Proceedings of the 7th WSEAS International Conference on Mathematical Methods and Computational Techniques In Electrical Engineering
  • Year:
  • 2005

Abstract

This paper presents vision-based motion analysis for human-robot interaction, which analyzes both camera motion and human motion. Camera motion is compensated by comparing edge features across consecutive image frames. Candidate regions of human motion are found by differencing the transformed t-th image against the (t-1)-th image, and human motion is finally determined from image features and motion analysis. The gesture recognition module detects the moving hand using motion analysis together with skin-color information obtained from face detection; from these, the variation of hand location and the meaningful gesture region are detected. We have experimented with moving-object detection and gesture recognition using an active pan/tilt/zoom camera and a single camera mounted on a mobile robot. Gesture recognition performance was evaluated on the ETRI database, and an encouraging recognition rate of 84% was obtained.
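The motion-detection pipeline described above (compensate camera motion by aligning edge features between consecutive frames, then difference the aligned frames to find candidate motion regions) can be sketched as follows. This is a minimal illustrative sketch in NumPy, not the authors' implementation: it assumes pure integer translation for the camera motion, uses a crude gradient-magnitude edge map in place of a real edge detector, and the function names (`estimate_shift`, `motion_candidates`) are hypothetical.

```python
import numpy as np

def edges(img):
    # Crude edge map: gradient magnitude thresholded to a binary mask
    # (a stand-in for the edge features used in the paper).
    gy, gx = np.gradient(img.astype(float))
    return (np.hypot(gx, gy) > 0.4).astype(float)

def estimate_shift(prev_edges, curr_edges, max_shift=3):
    # Brute-force search for the integer camera translation (dy, dx)
    # that best aligns the two edge maps (maximum edge overlap).
    best, best_score = (0, 0), -1.0
    for dy in range(-max_shift, max_shift + 1):
        for dx in range(-max_shift, max_shift + 1):
            shifted = np.roll(np.roll(prev_edges, dy, axis=0), dx, axis=1)
            score = float((shifted * curr_edges).sum())
            if score > best_score:
                best_score, best = score, (dy, dx)
    return best

def motion_candidates(prev, curr, thresh=0.5):
    # 1) estimate camera motion from edge features,
    # 2) warp the (t-1)-th frame to compensate it,
    # 3) difference against the t-th frame to get candidate motion regions.
    dy, dx = estimate_shift(edges(prev), edges(curr))
    compensated = np.roll(np.roll(prev, dy, axis=0), dx, axis=1)
    return np.abs(curr.astype(float) - compensated.astype(float)) > thresh
```

With synthetic frames where the whole scene shifts by one pixel (camera pan) and a single new bright pixel appears (object motion), the shift estimate recovers the pan and only the independently moving pixel survives the compensated difference.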