Real-Time Gesture Recognition from Depth Data through Key Poses Learning and Decision Forests
SIBGRAPI '12 Proceedings of the 2012 25th SIBGRAPI Conference on Graphics, Patterns and Images
The recent popularization of real-time depth sensors has broadened the potential applications of online gesture recognition to end-user natural user interfaces (NUIs). Such applications demand robust recognition that can cope with the noisy data these consumer depth sensors produce, while the quality of the resulting NUI depends heavily on recognition speed. This work introduces a method for real-time gesture recognition from a noisy skeleton stream, such as the one extracted by the Kinect depth sensor. Each pose is described using an angular representation of the skeleton joints. These descriptors serve to identify key poses through a multi-class Support Vector Machine classifier with a tailored pose kernel. The gesture is labeled on the fly from the key-pose sequence with a decision forest, which naturally performs gesture time control/warping and avoids the need for an initial or neutral pose. The proposed method runs in real time, and its robustness is evaluated in several experiments.
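The angular representation mentioned above can be sketched as follows. This is a minimal illustration, not the authors' implementation: the helper `angular_descriptor` and the joint-triple convention are assumptions, showing only the general idea of describing a pose by the angles formed at skeleton joints.

```python
import numpy as np

def angular_descriptor(joints, triples):
    """Describe a pose by joint angles (radians).

    `joints` is an (N, 3) array of 3D joint positions; each entry of
    `triples` is an index triple (a, b, c) defining the angle at joint b
    between segments b->a and b->c. Hypothetical sketch of the angular
    pose representation, not the paper's exact descriptor.
    """
    angles = []
    for a, b, c in triples:
        u = joints[a] - joints[b]          # segment from joint b toward a
        v = joints[c] - joints[b]          # segment from joint b toward c
        cos = np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v))
        angles.append(np.arccos(np.clip(cos, -1.0, 1.0)))
    return np.array(angles)

# Example: elbow angle from shoulder (index 0), elbow (1), wrist (2).
joints = np.array([[0.0, 0.0, 0.0],
                   [1.0, 0.0, 0.0],
                   [1.0, 1.0, 0.0]])
desc = angular_descriptor(joints, [(0, 1, 2)])  # right angle at the elbow
```

A descriptor like this is invariant to the skeleton's global position and, if the angles are taken between bone segments, largely invariant to limb lengths, which is one reason angular features are a natural fit for noisy, per-user skeleton streams.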