This paper describes a novel hand gesture recognition system that uses both multi-channel surface electromyogram (EMG) sensors and a 3D accelerometer (ACC) to enable user-friendly interaction between humans and computers. Segments corresponding to meaningful gestures are detected in the continuous EMG signal stream. Multi-stream Hidden Markov Models (HMMs), with separate EMG and ACC streams, are used as a decision-level fusion method to recognize hand gestures. The paper also presents a virtual Rubik's Cube game, controlled by hand gestures, that is used to evaluate the recognition system. For a set of 18 gestures, each trained with 10 repetitions, the average recognition accuracy was about 91.7% in a real application. The proposed method supports intelligent and natural gesture-based control.
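The segmentation step, detecting spans of meaningful gesture activity in a continuous EMG stream, can be sketched as simple energy thresholding. This is a minimal single-channel illustration, not the paper's exact algorithm; the `threshold` and `min_len` values are assumed, and the paper works with multi-channel EMG.

```python
def segment_gestures(emg, threshold=0.5, min_len=3):
    """Return [start, end) index spans where smoothed EMG energy
    exceeds a threshold (hypothetical onset/offset detector)."""
    # Rectify the signal, then smooth with a 3-sample moving average.
    rect = [abs(x) for x in emg]
    energy = [sum(rect[max(0, i - 2): i + 1]) / min(i + 1, 3)
              for i in range(len(rect))]
    segments, start = [], None
    for i, e in enumerate(energy):
        if e >= threshold and start is None:
            start = i                      # activity onset
        elif e < threshold and start is not None:
            if i - start >= min_len:       # discard spurious blips
                segments.append((start, i))
            start = None
    if start is not None and len(energy) - start >= min_len:
        segments.append((start, len(energy)))
    return segments

# Quiet baseline, a burst of muscle activity, then quiet again.
signal = [0.1] * 5 + [1.0] * 6 + [0.1] * 5
print(segment_gestures(signal))
```

Only the segments returned here would be passed on to the recognizer, so the classifier never scores resting-state signal.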
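The decision-level fusion of the EMG and ACC streams can be sketched as a weighted combination of per-stream HMM log-likelihoods, with the gesture class chosen by argmax. The gesture names, likelihood values, and equal stream weights below are illustrative assumptions; the paper's exact weighting scheme is not reproduced here.

```python
def fuse_and_classify(emg_loglik, acc_loglik, w_emg=0.5, w_acc=0.5):
    """Decision-level fusion: weighted sum of per-stream HMM
    log-likelihoods, one entry per candidate gesture.

    emg_loglik, acc_loglik: dicts mapping gesture -> log P(obs | gesture),
    as would be produced by separately trained EMG and ACC HMMs.
    """
    scores = {g: w_emg * emg_loglik[g] + w_acc * acc_loglik[g]
              for g in emg_loglik}
    return max(scores, key=scores.get)

# Hypothetical scores: the EMG stream is ambiguous, but the ACC
# stream strongly favours "rotate_cw", so fusion resolves it.
emg = {"wave_left": -10.2, "wave_right": -10.5, "rotate_cw": -10.4}
acc = {"wave_left": -15.0, "wave_right": -14.8, "rotate_cw": -9.1}
print(fuse_and_classify(emg, acc))  # -> rotate_cw
```

Working in log space keeps the per-stream scores numerically stable and makes the fusion a simple weighted sum rather than a product of small probabilities.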