In this paper, we present a person-independent 3D system for judging the correctness of a sign. The system is camera-based, using computer vision techniques to track the hands and extract features. 3D coordinates of the hands and other features are computed from stereo images. The features are then modeled statistically, and automatic feature selection is used to build the classifiers, each of which judges the correctness of a single sign. We tested our approach on a 120-sign vocabulary with 75 different signers. Overall, a true positive rate of 96.5% was achieved at a false positive rate of 3.5%. In a real-world setting, the system's judgements largely agreed with those of human experts.
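The abstract does not specify the statistical models or the selection algorithm, but the described pipeline (per-feature statistical models plus automatic feature selection feeding one per-sign correctness classifier) can be sketched as follows. This is a minimal illustration assuming independent Gaussian feature models and greedy forward selection; both are plausible but hypothetical choices, not necessarily the authors' method.

```python
import math
import random

def fit_gaussians(samples):
    """Fit an independent Gaussian per feature to correct-sign examples.

    samples: list of equal-length feature vectors (one per training example).
    Returns per-feature (means, stds); stds are floored to avoid division by zero.
    """
    n, d = len(samples), len(samples[0])
    means = [sum(s[j] for s in samples) / n for j in range(d)]
    stds = [max(math.sqrt(sum((s[j] - means[j]) ** 2 for s in samples) / n), 1e-6)
            for j in range(d)]
    return means, stds

def log_likelihood(x, means, stds, feats):
    """Gaussian log-likelihood of x restricted to the selected feature indices."""
    return sum(-0.5 * ((x[j] - means[j]) / stds[j]) ** 2 - math.log(stds[j])
               for j in feats)

def greedy_select(pos, neg, means, stds, k):
    """Greedy forward feature selection (illustrative stand-in).

    At each step, add the feature that most widens the gap between the mean
    log-likelihood of correct (pos) and incorrect (neg) sign attempts.
    """
    chosen, remaining = [], list(range(len(means)))

    def separation(feats):
        sp = sum(log_likelihood(x, means, stds, feats) for x in pos) / len(pos)
        sn = sum(log_likelihood(x, means, stds, feats) for x in neg) / len(neg)
        return sp - sn

    for _ in range(min(k, len(remaining))):
        best = max(remaining, key=lambda f: separation(chosen + [f]))
        chosen.append(best)
        remaining.remove(best)
    return chosen

def is_correct(x, means, stds, feats, threshold):
    """Per-sign verdict: accept the attempt if its log-likelihood clears a threshold."""
    return log_likelihood(x, means, stds, feats) >= threshold
```

In such a scheme, each sign's threshold would be tuned on held-out data to trade the true positive rate against the false positive rate, yielding an operating point like the 96.5%/3.5% reported above.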