Research on automatic sign language recognition (ASLR) has mostly been conducted from a machine learning perspective. We propose to incorporate results from studies of human sign recognition into ASLR. A previous study found that handshape is important for human sign recognition. The current paper implements that conclusion by adding handshape information, in three different representations, to an existing ASLR system. Recognition improves for all but one representation, which refutes the idea that extra (handshape) information will always improve recognition. Results also vary per sign: some sign classifiers improve greatly, others are unaffected, and in rare cases performance even decreases. Adapting classifiers to specific sign types could be key for future ASLR.
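The core idea — appending a handshape representation to an existing feature set and then checking per-sign recognition accuracy — can be sketched as follows. This is a minimal, hypothetical illustration only: the synthetic data, the nearest-centroid classifier, and the concatenation-based fusion are stand-ins invented here, not the paper's actual system or features.

```python
import numpy as np

rng = np.random.default_rng(0)

def make_data(n_signs=5, n_samples=40, base_dim=8, hand_dim=4):
    """Synthetic stand-in: each sign gets a base-feature centroid and a
    handshape-feature centroid; samples are noisy draws around them."""
    X_base, X_hand, y = [], [], []
    for s in range(n_signs):
        mu_b = rng.normal(size=base_dim)
        mu_h = rng.normal(size=hand_dim)
        X_base.append(mu_b + 0.8 * rng.normal(size=(n_samples, base_dim)))
        X_hand.append(mu_h + 0.8 * rng.normal(size=(n_samples, hand_dim)))
        y.append(np.full(n_samples, s))
    return np.vstack(X_base), np.vstack(X_hand), np.concatenate(y)

def nearest_centroid(X_train, y_train, X_test):
    """Toy classifier: assign each test sample to the closest class mean."""
    centroids = np.stack([X_train[y_train == s].mean(axis=0)
                          for s in np.unique(y_train)])
    d = ((X_test[:, None, :] - centroids[None, :, :]) ** 2).sum(axis=2)
    return d.argmin(axis=1)

X_base, X_hand, y = make_data()
order = rng.permutation(len(y))
tr, te = order[:150], order[150:]

# Baseline: existing (base) features only.
pred_base = nearest_centroid(X_base[tr], y[tr], X_base[te])

# With handshape: fuse by concatenating the handshape representation.
X_all = np.hstack([X_base, X_hand])
pred_all = nearest_centroid(X_all[tr], y[tr], X_all[te])

# Per-sign accuracy, mirroring the paper's per-sign analysis.
for s in np.unique(y):
    m = y[te] == s
    print(f"sign {s}: base {(pred_base[m] == y[te][m]).mean():.2f} "
          f"-> +handshape {(pred_all[m] == y[te][m]).mean():.2f}")
```

In a sketch like this, the per-sign breakdown is the interesting part: aggregate accuracy can rise while individual signs stay flat or degrade, which is exactly the per-sign variation the abstract reports.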