Gesture and speech comprise the most important modalities of human interaction, and considerable research has attempted to incorporate them into natural HCI. The challenges range from low-level signal processing of multi-modal input to high-level interpretation of natural speech and gesture. This paper proposes novel methods to recognize hand gestures and unvoiced utterances from surface electromyogram (sEMG) signals originating from different muscles. The focus of this work is a simple yet robust system that can identify subtle, complex hand gestures and unvoiced speech commands for the control of prostheses and other computer-assisted devices. The proposed multi-modal system identifies hand gestures using Independent Component Analysis (ICA) and silent utterances using the Integral RMS (IRMS) of the sEMG. The sEMG features were used to train a designed artificial neural network (ANN) architecture, yielding an overall recognition accuracy of 90.33%.
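The Integral RMS feature mentioned above can be understood as the RMS envelope of the sEMG signal accumulated (discretely integrated) over time. The following is a minimal sketch of that computation; the sampling rate, window length, and function name are hypothetical, as the abstract does not specify the exact parameterization used.

```python
import math

def irms(emg, fs=1000, win_ms=100):
    # Integral RMS sketch: compute the RMS of the sEMG over fixed,
    # non-overlapping windows, then sum the resulting envelope values
    # (a discrete integral). fs and win_ms are assumed values, not
    # taken from the paper.
    win = max(1, int(fs * win_ms / 1000))
    total = 0.0
    for start in range(0, len(emg) - win + 1, win):
        seg = emg[start:start + win]
        total += math.sqrt(sum(x * x for x in seg) / win)
    return total
```

Because RMS scales linearly with signal amplitude, stronger muscle activation yields a proportionally larger IRMS value, which is what makes it usable as a discriminative feature for an ANN classifier.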