In this paper we present a novel method called Temporal Nearest End-Effectors (TNEE) for automatically classifying full-body human actions captured in real time. The method models actions using only the recent positions of the user's end-effectors, i.e. the hands, head and feet, relative to the pelvis, so the essential information of a full-body movement is retained in a compact form. The recognition procedure combines an evaluation of the performed poses with a measure of temporal coherence. TNEE is evaluated on real motion-capture data and achieves results that are satisfactory for real-time applications.
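To make the idea concrete, the following is a minimal sketch, not the authors' implementation: each frame is reduced to a feature vector of end-effector positions relative to the pelvis, and a recent window of such poses is scored against per-action templates by combining a pose-distance term with a temporal-coherence term that rewards matches advancing in template order. All names (`pose_features`, `classify`, the joint keys) are hypothetical.

```python
import numpy as np

# Hypothetical joint keys; the paper's skeleton format may differ.
END_EFFECTORS = ["head", "left_hand", "right_hand", "left_foot", "right_foot"]

def pose_features(frame):
    """Flatten end-effector positions relative to the pelvis into one vector."""
    pelvis = np.asarray(frame["pelvis"], dtype=float)
    return np.concatenate([np.asarray(frame[e], dtype=float) - pelvis
                           for e in END_EFFECTORS])

def classify(window, templates):
    """Score each template action against a recent window of poses.

    Pose evaluation: each observed pose is matched to its nearest pose
    in the template. Temporal coherence: templates whose nearest-pose
    indices advance monotonically over the window are penalized less.
    """
    best_label, best_score = None, float("inf")
    obs = np.stack([pose_features(f) for f in window])
    for label, tmpl in templates.items():
        ref = np.stack([pose_features(f) for f in tmpl])
        # Distance from every observed pose to every template pose.
        dists = np.linalg.norm(obs[:, None, :] - ref[None, :, :], axis=2)
        nearest = dists.argmin(axis=1)
        pose_cost = dists.min(axis=1).mean()
        # Penalize matches that step backwards through the template.
        order_cost = np.maximum(0, -np.diff(nearest)).sum() / len(window)
        score = pose_cost + order_cost
        if score < best_score:
            best_label, best_score = label, score
    return best_label
```

A window of poses in which the right hand rises would then score closest to a "raise hand" template, since both the nearest-pose distances and the ordering penalty stay small.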