Gesture Recognition Using 3D Appearance and Motion Features

  • Authors:
  • Guangqi Ye; Jason J. Corso; Gregory D. Hager

  • Affiliations:
  • The Johns Hopkins University; The Johns Hopkins University; The Johns Hopkins University

  • Venue:
  • CVPRW '04: Proceedings of the 2004 Conference on Computer Vision and Pattern Recognition Workshop (CVPRW'04), Volume 10
  • Year:
  • 2004

Abstract

We present a novel 3D gesture recognition scheme that combines the 3D appearance of the hand and the motion dynamics of the gesture to classify manipulative and controlling gestures. Our method does not track the hand directly. Instead, we take an object-centered approach that efficiently computes the 3D appearance using a region-based coarse stereo matching algorithm in a volume around the hand. The motion cue is captured by temporally differencing the appearance feature. An unsupervised learning scheme captures the cluster structure of these feature volumes, and the image sequence of a gesture is then converted to a series of symbols indicating the cluster identity of each image pair. Two schemes (forward HMMs and neural networks) are used to model the dynamics of the gestures. We implemented a real-time system and performed extensive gesture recognition experiments to analyze the performance of different combinations of the appearance and motion features. The system achieves a recognition accuracy of over 96% when both the proposed appearance and motion cues are used.
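
The abstract outlines a pipeline: per-frame 3D appearance features from coarse stereo, a motion cue obtained by differencing, unsupervised vector quantization into discrete symbols, and HMM-based modeling of the symbol sequence. The Python sketch below illustrates that pipeline under stated assumptions; it is not the authors' implementation, and all feature sizes, symbol counts, and model parameters are placeholders.

```python
# A minimal sketch (not the authors' implementation) of the pipeline the
# abstract describes: per-frame 3D appearance features, a motion cue obtained
# by temporal differencing, unsupervised vector quantization into symbols,
# and a discrete HMM scored with the forward algorithm.
# Feature sizes, symbol counts, and parameters are illustrative assumptions.
import numpy as np
from sklearn.cluster import KMeans


def appearance_motion_features(volumes):
    """volumes: (T, D) array, one flattened 3D appearance feature per frame
    (e.g. occupancy of the coarse stereo volume around the hand).
    Returns (T-1, 2D) features: appearance concatenated with its temporal
    difference, which serves as the motion cue."""
    appearance = volumes[1:]
    motion = np.diff(volumes, axis=0)          # frame-to-frame differencing
    return np.hstack([appearance, motion])


def quantize(features, n_symbols=8, seed=0):
    """Cluster the feature vectors (unsupervised) and map each frame pair to
    the index of its nearest cluster centre, i.e. a discrete symbol."""
    km = KMeans(n_clusters=n_symbols, n_init=10, random_state=seed).fit(features)
    return km, km.labels_


def forward_log_likelihood(symbols, log_pi, log_A, log_B):
    """Log-likelihood of a discrete symbol sequence under an HMM with initial
    log-probabilities log_pi (N,), transition matrix log_A (N, N) and
    emission matrix log_B (N, M), computed with the forward algorithm in
    log space."""
    alpha = log_pi + log_B[:, symbols[0]]
    for s in symbols[1:]:
        alpha = np.logaddexp.reduce(alpha[:, None] + log_A, axis=0) + log_B[:, s]
    return np.logaddexp.reduce(alpha)
```

In this sketch, recognition would train one such HMM per gesture class (a left-to-right transition structure would correspond to the forward HMMs mentioned in the abstract) and assign a test sequence to the class whose model gives the highest log-likelihood; the neural-network alternative would instead classify the symbol sequence directly.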