This paper presents a framework for hand gesture recognition based on the information fusion of a three-axis accelerometer (ACC) and multichannel electromyography (EMG) sensors. In our framework, the start and end points of meaningful gesture segments are detected automatically from the intensity of the EMG signals. A decision tree and multistream hidden Markov models are then used for decision-level fusion to obtain the final classification results. For sign language recognition (SLR), experimental results on the classification of 72 Chinese Sign Language (CSL) words demonstrate the complementary functionality of the ACC and EMG sensors and the effectiveness of our framework. Additionally, the recognition of 40 CSL sentences is implemented to evaluate our framework for continuous SLR. For gesture-based control, a real-time interactive system, a virtual Rubik's cube game, is built using 18 kinds of hand gestures as control commands. With ten subjects playing the game, performance is also examined under both user-specific and user-independent classification. Our proposed framework facilitates intelligent and natural control in gesture-based interaction.
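The two pipeline stages named in the abstract can be sketched in code. Below is a minimal illustration, not the paper's implementation: gesture onset/offset detection from a smoothed EMG intensity envelope with hysteresis thresholds, and decision-level fusion as a weighted sum of per-stream HMM log-likelihoods. All parameter values (window length, threshold fractions, stream weight) and function names are illustrative assumptions, not values reported in the paper.

```python
import numpy as np

def segment_by_emg_intensity(emg, fs=1000, win_ms=50,
                             on_frac=0.1, off_frac=0.05):
    """Return (start, end) sample indices of active gesture segments.

    emg: (n_samples, n_channels) array of raw EMG. The window length and
    on/off threshold fractions are illustrative, not the paper's values.
    """
    win = max(1, int(fs * win_ms / 1000))
    intensity = np.abs(emg).mean(axis=1)              # rectify, pool channels
    envelope = np.convolve(intensity, np.ones(win) / win, mode="same")
    peak = envelope.max()
    segments, start = [], None
    for i, v in enumerate(envelope):
        if start is None and v > on_frac * peak:      # gesture onset
            start = i
        elif start is not None and v < off_frac * peak:  # offset (hysteresis)
            segments.append((start, i))
            start = None
    if start is not None:                             # still active at the end
        segments.append((start, len(envelope)))
    return segments

def fuse_decisions(loglik_acc, loglik_emg, w_acc=0.5):
    """Decision-level fusion: weighted sum of per-stream log-likelihoods.

    loglik_acc / loglik_emg map each candidate gesture class to the
    log-likelihood from that stream's HMM; returns the best-scoring class.
    """
    return max(loglik_acc,
               key=lambda c: w_acc * loglik_acc[c]
                             + (1 - w_acc) * loglik_emg[c])
```

Hysteresis (a lower offset threshold than onset threshold) keeps a segment from fragmenting when the envelope dips briefly during a gesture; the fusion weight `w_acc` would in practice be tuned on validation data.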