Design and usability analysis of gesture-based control for common desktop tasks
HCI'13 Proceedings of the 15th international conference on Human-Computer Interaction: interaction modalities and techniques - Volume Part IV
This paper presents the results of a user study of a vision-based, depth-sensitive input system for performing typical desktop tasks through arm gestures. We developed a vision-based HCI prototype for a comprehensive usability study. Using the Kinect 3D camera and the OpenNI software library, we implemented the system with high stability and efficiency by reducing ambient disturbances such as noise and dependence on lighting conditions. In the prototype, we designed an arm-gesture recognition algorithm using the NITE toolkit. Finally, through a comprehensive user experiment we compared natural arm gestures with conventional input devices (mouse/keyboard), on simple and complex tasks and in two display conditions (small and large screens), measuring precision, efficiency, ease of use, pleasantness, fatigue, naturalness, and overall satisfaction, to test the following hypothesis: on a WIMP user interface, gesture-based input is superior to mouse/keyboard when a large screen is used. Our empirical investigation also indicates that gestures are more natural and pleasant to use than mouse/keyboard; however, arm gestures cause more fatigue than the mouse.
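The NITE toolkit used in the prototype ships with built-in detectors for arm gestures such as swipes and pushes, typically driven by velocity and displacement thresholds on the tracked hand position. The following is a minimal sketch of that general idea — the thresholds and function name are illustrative assumptions, not the paper's actual algorithm or the NITE API:

```python
# Sketch of a displacement/duration-threshold swipe classifier, analogous in
# spirit to the kind of arm-gesture detection NITE provides. All thresholds
# are hypothetical defaults, not taken from the paper.

def detect_swipe(track, min_dx=200.0, max_dy=80.0, max_duration=0.5):
    """Classify a hand track as a horizontal swipe.

    track: list of (t, x, y) samples — time in seconds, position in mm.
    Returns 'swipe_right', 'swipe_left', or None.
    """
    if len(track) < 2:
        return None
    t0, x0, y0 = track[0]
    t1, x1, y1 = track[-1]
    if t1 - t0 > max_duration:    # too slow to count as a swipe
        return None
    if abs(y1 - y0) > max_dy:     # too much vertical drift
        return None
    dx = x1 - x0
    if dx >= min_dx:
        return 'swipe_right'
    if dx <= -min_dx:
        return 'swipe_left'
    return None
```

A track of hand positions moving 250 mm to the right within 0.3 s would be classified as `'swipe_right'`; the same motion stretched over a full second would be rejected by the duration check, which is how such detectors separate deliberate gestures from ordinary arm movement.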