Sign language is a method of communication for the deaf. Articulated gestures and postures of the hands and fingers are commonly used in sign language. This paper presents a system that recognizes Korean Sign Language (KSL) and translates it into ordinary Korean text. A pair of data gloves is used as the sensing device for detecting hand and finger motions. For efficient recognition of gestures and postures, a technique for classifying motions is proposed, and a fuzzy min-max neural network is adopted for online pattern recognition.
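The abstract does not detail the fuzzy min-max network itself, but the general idea behind such classifiers is well known: each class is covered by one or more hyperboxes, and an input's degree of membership in a box decays smoothly as the input moves outside it. As a rough illustration only (not the authors' implementation), a minimal sketch of Simpson-style hyperbox membership and nearest-box classification, with all names and the sensitivity parameter `gamma` chosen for the example:

```python
import numpy as np

def hyperbox_membership(a, v, w, gamma=4.0):
    """Fuzzy min-max membership of input vector `a` in a hyperbox
    with min point `v` and max point `w` (all in [0, 1]^n).
    Returns 1.0 inside the box, decaying toward 0 outside at a
    rate set by the sensitivity parameter `gamma`."""
    a, v, w = (np.asarray(x, dtype=float) for x in (a, v, w))
    # Penalty for exceeding the box's max point in each dimension
    over = np.maximum(0.0, 1.0 - np.maximum(0.0, gamma * np.minimum(1.0, a - w)))
    # Penalty for falling below the box's min point in each dimension
    under = np.maximum(0.0, 1.0 - np.maximum(0.0, gamma * np.minimum(1.0, v - a)))
    return float(np.mean((over + under) / 2.0))

def classify(a, boxes):
    """Return the label of the hyperbox with the highest membership.
    `boxes` is a list of (min_point, max_point, label) tuples."""
    return max(boxes, key=lambda b: hyperbox_membership(a, b[0], b[1]))[2]
```

In a glove-based recognizer of this kind, each input vector would be a normalized frame of finger-joint and hand-motion readings, and the smooth membership values are what make online, frame-by-frame recognition practical.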