Free-hands interaction in augmented reality
Proceedings of the 1st symposium on Spatial user interaction
This paper presents a bare-handed interaction method for augmented reality games based on human hand gestures. Point features are tracked across input video frames, and the motion of moving objects is computed from them. The patterns of the resulting motion trajectories are used to decide whether a motion is an intended gesture: a smooth trajectory toward one of the virtual objects or menus is classified as intended, and the corresponding action is invoked. To demonstrate the validity of the proposed method, we implemented two simple augmented reality applications: a gesture-based music player and a virtual basketball game. Experiments with three untrained users indicate that the accuracy of menu activation for intended gestures is 94% for normal-speed gestures and 84% for fast, abrupt gestures.
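The abstract only sketches the pipeline (track point features, accumulate motion trajectories, classify smooth trajectories toward a target as intended gestures). The snippet below is a minimal illustrative sketch, not the authors' implementation: it assumes OpenCV's Shi-Tomasi corner detector and Lucas-Kanade optical flow for point tracking, and measures "smoothness" as the ratio of straight-line displacement to total path length. The thresholds `smooth_thresh` and `hit_radius`, and the helper names, are hypothetical.

```python
import cv2
import numpy as np

def track_trajectories(video_path, max_corners=100):
    """Track point features across frames with Lucas-Kanade optical flow
    and accumulate each feature's motion trajectory."""
    cap = cv2.VideoCapture(video_path)
    ok, frame = cap.read()
    if not ok:
        return []
    prev_gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    points = cv2.goodFeaturesToTrack(prev_gray, maxCorners=max_corners,
                                     qualityLevel=0.01, minDistance=7)
    if points is None:
        return []
    trajectories = [[tuple(p.ravel())] for p in points]

    while True:
        ok, frame = cap.read()
        if not ok:
            break
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        new_points, status, _ = cv2.calcOpticalFlowPyrLK(prev_gray, gray, points, None)
        for traj, p, st in zip(trajectories, new_points, status):
            if st:  # point successfully tracked into this frame
                traj.append(tuple(p.ravel()))
        points, prev_gray = new_points, gray
    cap.release()
    return trajectories

def is_intended_gesture(traj, target, smooth_thresh=0.8, hit_radius=40.0):
    """Classify a trajectory as an intended gesture if it is nearly straight
    overall and ends close to the target virtual object or menu item."""
    pts = np.asarray(traj, dtype=np.float32)
    if len(pts) < 3:
        return False
    path_len = np.sum(np.linalg.norm(np.diff(pts, axis=0), axis=1))
    direct_len = np.linalg.norm(pts[-1] - pts[0])
    smoothness = direct_len / (path_len + 1e-6)   # 1.0 means perfectly straight
    near_target = np.linalg.norm(pts[-1] - np.asarray(target, dtype=np.float32)) < hit_radius
    return smoothness > smooth_thresh and near_target
```

Under these assumptions, an application would call `track_trajectories` on the live feed and invoke a menu action whenever `is_intended_gesture` returns true for the trajectory and that menu's screen position; the paper's actual feature tracker, smoothness criterion, and thresholds may differ.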