A New Instrumented Approach for Translating the American Sign Language into Sound and Text
This paper presents a novel approach for capturing isolated American Sign Language gestures and translating them into spoken and written words. The instrumented part of the system combines an AcceleGlove with a two-link arm skeleton. Signs are broken down into unique sequences of phonemes, called Poses and Movements, which are recognized by software modules trained and tested independently on volunteers with different hand sizes and signing ability. Recognition rates of the independent modules reached up to 100% for 42 postures, 6 orientations, 11 locations, and 7 movements using linear classification. The overall sign recognizer was tested on a subset of the American Sign Language dictionary comprising 30 one-handed signs, achieving 98% accuracy. The system proved scalable: when the lexicon was extended to 176 signs and tested without retraining, accuracy remained at 95%. This represents an improvement over classification based on hidden Markov models and neural networks.
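The pipeline the abstract describes — independent linear classifiers for posture, orientation, location, and movement whose outputs form a phoneme sequence that is looked up in a lexicon — can be sketched as follows. This is a minimal illustration, not the paper's implementation: the nearest-centroid rule stands in for "linear classification", and all feature values, phoneme labels, and lexicon entries are hypothetical.

```python
def classify_linear(features, centroids):
    """Nearest-centroid linear classifier: return the label whose centroid
    is closest (squared Euclidean distance) to the feature vector."""
    best_label, best_dist = None, float("inf")
    for label, centroid in centroids.items():
        dist = sum((f - c) ** 2 for f, c in zip(features, centroid))
        if dist < best_dist:
            best_label, best_dist = label, dist
    return best_label

# Illustrative centroids for two of the 42 postures and two of the
# 6 orientations (2-D feature vectors for brevity; real glove data
# would be higher-dimensional).
POSTURE_CENTROIDS = {"flat_hand": (0.9, 0.1), "fist": (0.1, 0.9)}
ORIENT_CENTROIDS = {"palm_down": (1.0, 0.0), "palm_up": (0.0, 1.0)}

# A sign is a unique phoneme sequence: a Pose (posture, orientation)
# followed by a Movement. Entries here are made up for the example.
LEXICON = {
    (("flat_hand", "palm_down"), "sweep_right"): "NICE",
    (("fist", "palm_up"), "arc_up"): "DAY",
}

def recognize_sign(posture_feat, orient_feat, movement):
    """Classify each phoneme independently, then match the sequence
    against the lexicon."""
    pose = (
        classify_linear(posture_feat, POSTURE_CENTROIDS),
        classify_linear(orient_feat, ORIENT_CENTROIDS),
    )
    return LEXICON.get((pose, movement), "<unknown>")

print(recognize_sign((0.85, 0.2), (0.9, 0.1), "sweep_right"))  # NICE
```

Because each phoneme classifier is trained independently, extending the lexicon only means adding new phoneme sequences to the dictionary, which is consistent with the scalability result reported in the abstract (no retraining when growing from 30 to 176 signs).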