Learning dynamics for exemplar-based gesture recognition

  • Authors:
  • Ahmed Elgammal, Vinay Shet, Yaser Yacoob, Larry S. Davis

  • Affiliations:
  • Ahmed Elgammal: Department of Computer Science, Rutgers University, Piscataway, NJ
  • Vinay Shet, Yaser Yacoob, Larry S. Davis: Computer Vision Laboratory, University of Maryland, College Park, MD

  • Venue:
  • CVPR'03: Proceedings of the 2003 IEEE Computer Society Conference on Computer Vision and Pattern Recognition
  • Year:
  • 2003


Abstract

This paper addresses the problem of capturing dynamics in exemplar-based recognition systems. Traditional HMMs provide a probabilistic tool for capturing system dynamics, and in the exemplar paradigm, HMM states are typically coupled to the exemplars. We instead propose a nonparametric HMM approach that uses a discrete HMM with arbitrary states (decoupled from the exemplars) to capture the dynamics over a large exemplar space, where a nonparametric estimation approach models the exemplar distribution. This reduces the need for lengthy and suboptimal training of the HMM observation model. We apply the proposed approach to view-based gesture recognition, representing each gesture as a sequence of learned body poses (exemplars). Gestures are recognized through a probabilistic framework that matches these body poses and imposes temporal constraints between different poses using the proposed nonparametric HMM.
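To make the abstract's core idea concrete, the following is a minimal sketch (not the authors' implementation) of a discrete HMM whose observation likelihood comes from a kernel density estimate over a state's exemplars, rather than coupling each state to a single exemplar. The scalar exemplars, Gaussian kernel, bandwidth, and toy transition matrix are all illustrative assumptions.

```python
import math

def gaussian_kernel(x, center, bandwidth):
    """1-D Gaussian kernel (exemplars are scalars here purely for illustration)."""
    z = (x - center) / bandwidth
    return math.exp(-0.5 * z * z) / (bandwidth * math.sqrt(2 * math.pi))

def observation_likelihood(x, state_exemplars, bandwidth=0.5):
    """P(x | state): a nonparametric kernel density estimate over the
    exemplars associated with this state (the state is decoupled from
    any single exemplar)."""
    return sum(gaussian_kernel(x, e, bandwidth)
               for e in state_exemplars) / len(state_exemplars)

def forward(observations, trans, init, exemplars_per_state, bandwidth=0.5):
    """Standard HMM forward algorithm; returns P(observations | model)."""
    n_states = len(init)
    alpha = [init[s] * observation_likelihood(observations[0],
                                              exemplars_per_state[s],
                                              bandwidth)
             for s in range(n_states)]
    for x in observations[1:]:
        alpha = [observation_likelihood(x, exemplars_per_state[s], bandwidth)
                 * sum(alpha[sp] * trans[sp][s] for sp in range(n_states))
                 for s in range(n_states)]
    return sum(alpha)

# Toy example: two states, each associated with its own cloud of exemplars.
exemplars_per_state = [[0.0, 0.2, -0.1], [2.0, 2.1, 1.9]]
trans = [[0.8, 0.2], [0.3, 0.7]]   # hypothetical transition probabilities
init = [0.5, 0.5]
seq = [0.1, 0.0, 2.0, 2.2]         # a pose sequence near state 0, then state 1
print(forward(seq, trans, init, exemplars_per_state))
```

In the paper's setting the exemplars would be learned body poses and the kernel a distance in pose space; this sketch only illustrates how an arbitrary discrete state can score observations through a nonparametric density over many exemplars.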