We describe a hierarchical approach to recognizing continuous hand gestures. It combines feature extraction by hierarchical nonlinear dimensionality reduction with motion modeling by a Hierarchical Conditional Random Field (Hierarchical CRF). An articulated hand can be decomposed into several hand parts, and we explore the underlying structure of the articulated action spaces of both the whole hand and its parts using a Hierarchical Gaussian Process Latent Variable Model (HGPLVM). In this hierarchical latent variable space, we propose a Hierarchical CRF that models hand motions by simultaneously capturing the extrinsic class dynamics and learning the relationship between the motions of hand parts and class labels. Promising recognition performance is obtained on our user-defined hand gesture dataset.
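As a loose illustration of CRF-style sequence decoding (not the paper's Hierarchical CRF, whose structure couples hand-part motions with class labels), the sketch below runs Viterbi decoding for a plain linear-chain CRF in NumPy. The emission and transition scores are toy values chosen for the example, not learned parameters.

```python
import numpy as np

def viterbi(emissions, transitions):
    """Most-likely label sequence under a linear-chain CRF.

    emissions:   (T, K) per-frame label scores (log-potentials)
    transitions: (K, K) label-to-label transition scores (log-potentials)
    Returns the highest-scoring label path of length T.
    """
    T, K = emissions.shape
    score = emissions[0].copy()            # best score of a path ending in each label
    backptr = np.zeros((T, K), dtype=int)  # argmax predecessors for traceback
    for t in range(1, T):
        # cand[i, j]: best path ending at label i, extended to label j at frame t
        cand = score[:, None] + transitions + emissions[t][None, :]
        backptr[t] = cand.argmax(axis=0)
        score = cand.max(axis=0)
    # trace the best path backwards from the highest-scoring final label
    path = [int(score.argmax())]
    for t in range(T - 1, 0, -1):
        path.append(int(backptr[t, path[-1]]))
    return path[::-1]

# Toy example: 2 gesture labels over 4 frames
emis = np.array([[2.0, 0.0], [1.5, 0.5], [0.0, 2.0], [0.0, 2.0]])
trans = np.array([[0.5, -0.5], [-0.5, 0.5]])  # favor staying in the same label
print(viterbi(emis, trans))  # -> [0, 0, 1, 1]
```

The transition matrix rewards label persistence, so the decoder switches labels only when the per-frame evidence outweighs the transition penalty.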