Nonlinear PHMMs for the Interpretation of Parameterized Gesture

  • Authors:
  • A. D. Wilson; A. F. Bobick

  • Venue:
  • CVPR '98 Proceedings of the IEEE Computer Society Conference on Computer Vision and Pattern Recognition
  • Year:
  • 1998


Abstract

In previous work [14], we modified the hidden Markov model (HMM) framework to incorporate a global parametric variation in the output probabilities of the HMM's states. Development of the parametric hidden Markov model (PHMM) was motivated by the task of simultaneously recognizing and interpreting gestures that exhibit meaningful variation. With standard HMMs, such global variation confounds the recognition process. The original PHMM approach assumes a linear dependence of the output density means on the global parameter. In this paper we extend the PHMM to handle arbitrary smooth (nonlinear) dependencies. We present a generalized expectation-maximization (GEM) algorithm for training the PHMM and a second GEM algorithm that simultaneously recognizes the gesture and estimates the value of the parameter. We report results on a pointing gesture, where the nonlinear approach permits the natural azimuth/elevation parameterization of pointing direction.
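The core idea, a state's output density mean depending on a global parameter, can be sketched as follows. This is a hedged illustration, not the authors' implementation: the linear case maps the parameter through a per-state matrix, while a nonlinear dependence can be any smooth function, e.g. mapping azimuth/elevation to a 3-D pointing direction. Function names, dimensions, and the isotropic-variance choice here are assumptions for clarity.

```python
import math

def linear_mean(mu_bar, W, theta):
    """Linear PHMM: mu_j(theta) = mu_bar_j + W_j theta (per-state offset + matrix)."""
    return [m + sum(w * t for w, t in zip(row, theta))
            for m, row in zip(mu_bar, W)]

def pointing_mean(theta):
    """Nonlinear example: (azimuth, elevation) -> unit 3-D pointing direction."""
    az, el = theta
    return [math.cos(el) * math.cos(az),
            math.cos(el) * math.sin(az),
            math.sin(el)]

def gaussian_log_density(x, mu, var):
    """Isotropic Gaussian log-density: the state's output log-probability b_j(x; theta)."""
    d = len(x)
    sq = sum((xi - mi) ** 2 for xi, mi in zip(x, mu))
    return -0.5 * (d * math.log(2 * math.pi * var) + sq / var)

# Pointing straight ahead (az = 0, el = 0) gives direction (1, 0, 0).
mu = pointing_mean([0.0, 0.0])
print(mu)  # [1.0, 0.0, 0.0]
```

During recognition, the GEM procedure would search over theta to maximize the total observation likelihood; the snippet above only shows how a single state's density varies with theta.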