Automatic Recognition of Colloquial Australian Sign Language
WACV/MOTION '05: Proceedings of the IEEE Workshop on Motion and Video Computing, Volume 2
A sign language recognition system must use information from both global features, such as hand movement and location, and local features, such as hand shape and orientation. In this paper, we present a local-feature recognizer for a sign language recognition system. Our approach represents the hand images extracted from sign-language video as symbols, each corresponding to a cluster produced by a clustering technique. The clusters are built from a training set of extracted hand images so that hands with similar appearance are grouped into the same cluster in an eigenspace. The experimental results indicate that our system can recognize sign language words even in two-handed and hand-to-hand contact cases.
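The pipeline described above — project extracted hand images onto an eigenspace, cluster the projections, and use the cluster index as a symbol — can be sketched as follows. This is a minimal illustration on synthetic data, not the paper's implementation: the eigenspace dimensionality, the number of clusters, and the farthest-point k-means initialisation are all assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy stand-in for extracted hand images: 60 flattened 8x8 grayscale
# patches generated from three synthetic "hand shape" prototypes.
prototypes = rng.random((3, 64))
train = np.vstack([p + 0.05 * rng.standard_normal((20, 64)) for p in prototypes])

# --- Eigenspace: project images onto the top principal components ---
mean = train.mean(axis=0)
centered = train - mean
# SVD of the centered data yields the eigenvectors of the covariance matrix.
_, _, vt = np.linalg.svd(centered, full_matrices=False)
components = vt[:4]                      # keep 4 eigenvectors (assumed)
projected = centered @ components.T      # coordinates in the eigenspace


def kmeans(x, k, iters=20):
    """Plain k-means with deterministic farthest-point initialisation."""
    centers = [x[0]]
    for _ in range(k - 1):
        d = np.min([((x - c) ** 2).sum(-1) for c in centers], axis=0)
        centers.append(x[np.argmax(d)])   # pick the point farthest from all
    centers = np.array(centers)
    for _ in range(iters):
        labels = np.argmin(((x[:, None] - centers) ** 2).sum(-1), axis=1)
        centers = np.array([x[labels == j].mean(axis=0) for j in range(k)])
    return centers, labels


# Cluster in the eigenspace; each cluster index acts as one symbol.
centers, symbols = kmeans(projected, k=3)


def to_symbol(image):
    """Map a new hand image to the symbol of its nearest cluster."""
    coords = (image - mean) @ components.T
    return int(np.argmin(((centers - coords) ** 2).sum(-1)))
```

In a full system, the resulting symbol sequence for a sign would then feed the word-level recognizer alongside global features such as hand movement and location.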