We present effective and robust algorithms to recognize isolated signs in Signing Exact English (SEE). The sign-level recognition scheme comprises classifiers for handshape, hand movement, and hand location. The SEE gesture data are acquired using a CyberGlove® and magnetic trackers. A linear decision tree with Fisher's linear discriminant (FLD) is used to classify 27 SEE handshapes. Hand movement trajectory is classified using vector quantization principal component analysis (VQPCA); both periodic and non-periodic SEE sign gestures are recognized from isolated 3-D hand trajectories. Experiments yielded an average handshape recognition accuracy of 96.1% on "unseen" signers (signers not represented in the training data). The average trajectory recognition rates with VQPCA for non-periodic and periodic gestures were 97.3% and 97.0%, respectively. These classifiers were combined with a hand location classifier for sign-level recognition, yielding an accuracy of 86.8% on a 28-sign SEE vocabulary.
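The VQPCA trajectory classifier mentioned above can be illustrated with a minimal sketch. This is not the authors' implementation; it is an assumed reconstruction of the general technique: fixed-length trajectory feature vectors are partitioned with k-means (the vector quantization step), a local PCA subspace is fitted in each cell, and a new trajectory is scored by its smallest reconstruction residual over the cells. One such model would be trained per gesture class, and a trajectory assigned to the class with minimal residual. All function names and parameters here are illustrative.

```python
import numpy as np

def fit_vqpca(X, n_clusters=2, n_components=1, n_iter=20, seed=0):
    """Fit a VQPCA model: k-means partition + local PCA per cell.

    X: (n_samples, n_features) array of fixed-length trajectory vectors.
    Returns a list of (cell_mean, principal_axes) pairs.
    """
    rng = np.random.default_rng(seed)
    centers = X[rng.choice(len(X), n_clusters, replace=False)]
    for _ in range(n_iter):
        # Vector quantization: assign each sample to its nearest center.
        d = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(axis=-1)
        labels = d.argmin(axis=1)
        for k in range(n_clusters):
            if (labels == k).any():
                centers[k] = X[labels == k].mean(axis=0)
    model = []
    for k in range(n_clusters):
        Xk = X[labels == k]
        mu = Xk.mean(axis=0)
        # Local principal axes from the SVD of the centred cell data.
        _, _, Vt = np.linalg.svd(Xk - mu, full_matrices=False)
        model.append((mu, Vt[:n_components]))
    return model

def reconstruction_error(x, model):
    """Distance of x to the model: smallest residual over the local PCA cells."""
    errs = []
    for mu, V in model:
        r = x - mu
        proj = V.T @ (V @ r)  # projection onto the local principal subspace
        errs.append(np.linalg.norm(r - proj))
    return min(errs)
```

For classification, one model is fitted per gesture class on training trajectories; a test trajectory is labelled with the class whose model gives the smallest `reconstruction_error`.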