This study proposes a system for recognizing static gestures in Taiwan Sign Language (TSL) using 3D data and trained neural networks. The 3D hand gesture data are acquired from a VICON motion-capture system, then processed and converted into features that are fed to a neural network. The extracted features are invariant to occlusion, rotation, scaling, and translation of the hand. Experimental results indicate that the proposed system recognizes the 20 static TSL hand gestures with an average accuracy of 96.58%. Moreover, the difference in recognition rate between the training and test sets is only 3.98%, showing that the proposed system is robust.
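The abstract does not specify how the invariant features are constructed. As a hedged sketch only (not the authors' method): one standard way to obtain translation-, rotation-, and scale-invariant features from 3D marker positions is to take the pairwise Euclidean distances between markers (invariant to rotation and translation) and normalize them by the largest distance (adding scale invariance). The function name and marker layout below are illustrative assumptions.

```python
import numpy as np

def invariant_features(markers: np.ndarray) -> np.ndarray:
    """Illustrative feature extractor, not the paper's actual method.

    markers: (n, 3) array of 3D marker positions for one hand pose.
    Returns a 1-D feature vector of the n*(n-1)/2 pairwise distances,
    normalized by the largest distance. Pairwise distances are unchanged
    by rotation and translation; the normalization removes scale.
    """
    # All pairwise distances via broadcasting: d[i, j] = ||m_i - m_j||
    d = np.linalg.norm(markers[:, None, :] - markers[None, :, :], axis=-1)
    # Keep only the upper triangle (each pair once, no zero diagonal)
    iu = np.triu_indices(len(markers), k=1)
    feats = d[iu]
    return feats / feats.max()
```

A feature vector like this could then be fed to a neural-network classifier over the 20 gesture classes; handling occlusion (missing markers) would require additional machinery not shown here.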