Toward Real-Time Human-Computer Interaction with Continuous Dynamic Hand Gestures

  • Authors:
  • Yuanxin Zhu; Haibing Ren; Guangyou Xu; Xueyin Lin


  • Venue:
  • FG '00: Proceedings of the Fourth IEEE International Conference on Automatic Face and Gesture Recognition, 2000
  • Year:
  • 2000

Abstract

Aiming at real-time gesture-controlled interaction, this paper describes the visual modeling, analysis, and recognition of continuous dynamic hand gestures. A spatio-temporal appearance model and novel analysis approaches are proposed for dynamic gestures, built on the hierarchical integration of multiple cues. At the low level, a fusion of skin-chrominance analysis and coarse image motion detection is employed to detect and segment hand gestures; at the high level, the parameters of the spatio-temporal appearance model are recovered by combining robust parameterized image motion estimation with hand shape analysis. The approach therefore achieves real-time processing as well as high recognition rates. Without resorting to any special markers, twelve kinds of hand gestures can be recognized with an average accuracy of over 89%. A prototype system, a gesture-controlled panoramic map browser, is designed and implemented to demonstrate the usability of gesture-controlled interaction.
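
To illustrate the low-level cue fusion described in the abstract, the sketch below combines a skin-chrominance mask with a coarse motion mask. This is not the authors' implementation: the chrominance thresholds, the use of OpenCV, the YCrCb color space, and plain frame differencing as a stand-in for "coarse image motion detection" are all assumptions made for illustration.

```python
import cv2
import numpy as np

# Hypothetical chrominance bounds for skin in YCrCb space; the paper does not
# publish its thresholds, so these values are illustrative only.
SKIN_LOWER = np.array([0, 133, 77], dtype=np.uint8)    # (Y, Cr, Cb)
SKIN_UPPER = np.array([255, 173, 127], dtype=np.uint8)

def segment_hand(prev_frame, frame, motion_thresh=25):
    """Fuse a skin-chrominance mask with a coarse frame-difference motion mask."""
    # Chrominance cue: threshold Cr/Cb to isolate skin-colored pixels.
    ycrcb = cv2.cvtColor(frame, cv2.COLOR_BGR2YCrCb)
    skin_mask = cv2.inRange(ycrcb, SKIN_LOWER, SKIN_UPPER)

    # Motion cue: coarse detection via gray-level frame differencing.
    gray_prev = cv2.cvtColor(prev_frame, cv2.COLOR_BGR2GRAY)
    gray_curr = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    diff = cv2.absdiff(gray_curr, gray_prev)
    _, motion_mask = cv2.threshold(diff, motion_thresh, 255, cv2.THRESH_BINARY)

    # Fusion: keep only pixels that are both skin-colored and moving,
    # then clean up the result with a morphological opening.
    fused = cv2.bitwise_and(skin_mask, motion_mask)
    kernel = np.ones((5, 5), np.uint8)
    return cv2.morphologyEx(fused, cv2.MORPH_OPEN, kernel)
```

In this sketch the fused mask would feed the high-level stage (parameterized motion estimation and shape analysis); that stage is not shown here, since the abstract gives only its overall structure.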