This paper proposes a vision-based human arm gesture recognition method for human-robot interaction, particularly at long distances where speech information is unavailable. We define four meaningful arm gestures for long-range interaction. The proposed method recognizes the defined gestures using only low-resolution 320×240 input images captured by a single camera at a long distance, approximately five meters from the camera. In addition, the system differentiates the target gestures from users' normal, unconstrained daily actions. For human detection at a long distance, the proposed approach combines the results of mean-shift color tracking, short- and long-range face detection, and omega-shape detection. The system then detects arm blocks using background subtraction with a background-updating module, and it recognizes the target gestures from the region, periodic motion, and shape of the arm blocks. In experiments on a large, realistic database, a recognition rate of 97.235% is achieved, a level practical enough for various pervasive and ubiquitous gesture-based applications.
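The arm-block detection step described above relies on background subtraction with a continuously updated background model. A minimal sketch of that idea, assuming a running-average update rule and illustrative parameters `alpha` and `tau` (neither is specified in the abstract), might look like this:

```python
# Sketch of background subtraction with a running-average background
# update, as the abstract describes for arm-block detection.
# The update rule, alpha, and tau are illustrative assumptions,
# not the authors' actual implementation.

def update_background(bg, frame, alpha=0.05):
    """Blend the new frame into the background model:
    B <- (1 - alpha) * B + alpha * I."""
    return [[(1 - alpha) * b + alpha * f
             for b, f in zip(brow, frow)]
            for brow, frow in zip(bg, frame)]

def foreground_mask(bg, frame, tau=25):
    """Mark pixels whose deviation from the background exceeds tau."""
    return [[abs(f - b) > tau
             for b, f in zip(brow, frow)]
            for brow, frow in zip(bg, frame)]

if __name__ == "__main__":
    bg = [[10, 10], [10, 10]]        # toy 2x2 grayscale background
    frame = [[10, 200], [10, 10]]    # one bright "foreground" pixel
    mask = foreground_mask(bg, frame)
    print(mask)                      # [[False, True], [False, False]]
    bg = update_background(bg, frame)
```

Slowly adapting the background this way lets the model absorb gradual scene changes (lighting drift, parked objects) while still flagging fast-moving regions such as a waving arm.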