This article investigates the pattern recognition of key-press finger gestures based on surface electromyographic (SEMG) signals and the feasibility of using such gestures in interactive applications. Two sorts of recognition experiments were first designed to explore the feasibility and repeatability of SEMG-based classification of 16 key-press finger gestures of the right hand and 4 control gestures, with the key-press gestures defined with reference to the standard PC keyboard. Based on the experimental results, the 10 best-recognized key-press gestures were selected as the numeric input keys of a simulated phone, and the 4 control gestures were mapped to 4 control keys. Two types of use tests, volume setting and SMS sending, were then conducted to assess gesture-based interaction performance and users' attitudes toward this technique; the results showed that users accepted this novel input strategy as a fresh experience.
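A typical SEMG gesture-recognition pipeline of the kind described above segments the signal into windows, extracts time-domain features per channel, and classifies the feature vector. The sketch below is a minimal illustration of that idea, not the authors' actual method: it assumes mean-absolute-value and root-mean-square features with a simple nearest-centroid classifier, and the gesture names and synthetic data are hypothetical.

```python
import numpy as np

def features(window):
    """Per-channel MAV and RMS features for one SEMG window (channels x samples)."""
    mav = np.mean(np.abs(window), axis=1)
    rms = np.sqrt(np.mean(window ** 2, axis=1))
    return np.concatenate([mav, rms])

def fit_centroids(windows, labels):
    """Average feature vector per gesture class (nearest-centroid training)."""
    feats = np.array([features(w) for w in windows])
    labels = np.array(labels)
    return {c: feats[labels == c].mean(axis=0) for c in set(labels)}

def classify(window, centroids):
    """Assign the gesture whose class centroid is nearest in feature space."""
    f = features(window)
    return min(centroids, key=lambda c: np.linalg.norm(f - centroids[c]))

# Synthetic demo: two 2-channel "gestures" that differ only in amplitude.
rng = np.random.default_rng(0)
train, lab = [], []
for _ in range(20):
    train.append(rng.normal(0, 0.2, (2, 256))); lab.append("weak")
    train.append(rng.normal(0, 1.0, (2, 256))); lab.append("strong")
cents = fit_centroids(train, lab)
pred = classify(rng.normal(0, 1.0, (2, 256)), cents)
```

A real system would use many more classes (e.g., the 16 key-press plus 4 control gestures studied here), richer features, and a stronger classifier, but the window-feature-classify structure is the same.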