A method is presented to help users look up the meaning of an unknown sign from American Sign Language (ASL). The user submits a video of the unknown sign as a query, and the system retrieves the most similar signs from a database of sign videos. The user then reviews the retrieved videos to identify the one displaying the sign of interest. Hands are detected semi-automatically: the system performs hand detection and tracking, and the user has the option to verify and correct the detected hand locations. Features are extracted based on hand motion and hand appearance. Similarity between signs is measured by combining dynamic time warping (DTW) scores, which are based on hand motion, with a simple similarity measure based on hand appearance. In user-independent experiments, with a system vocabulary of 1,113 signs, the correct sign was included in the top 10 matches for 78% of the test queries.
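To illustrate the retrieval step described above, the following is a minimal sketch of ranking database signs by a combination of a DTW score over hand-motion trajectories and an appearance distance. The function names, the Euclidean local cost, and the combination weight `w` are illustrative assumptions, not the authors' actual implementation or feature set.

```python
import math

def dtw(query, model):
    """DTW distance between two trajectories of (x, y) hand positions."""
    n, m = len(query), len(model)
    INF = float("inf")
    cost = [[INF] * (m + 1) for _ in range(n + 1)]
    cost[0][0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            d = math.dist(query[i - 1], model[j - 1])  # local cost (assumed Euclidean)
            cost[i][j] = d + min(cost[i - 1][j],       # insertion
                                 cost[i][j - 1],       # deletion
                                 cost[i - 1][j - 1])   # match
    return cost[n][m]

def combined_score(query_traj, model_traj, appearance_dist, w=0.5):
    """Blend motion (DTW) and appearance distances; the weight w is an assumption."""
    return w * dtw(query_traj, model_traj) + (1 - w) * appearance_dist

def rank_database(query_traj, appearance_dists, database):
    """Return sign labels sorted by combined score, best match first."""
    scored = sorted(
        (combined_score(query_traj, traj, appearance_dists[label]), label)
        for label, traj in database.items()
    )
    return [label for _, label in scored]
```

A user's query video would yield one trajectory per hand plus appearance distances to each database sign; `rank_database` then produces the ranked list from which the user picks the sign of interest.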