We present in this paper an approach to hand gesture analysis that aims at recognizing digits. The analysis is based on extracting a set of features from a hand image and then combining them using an induction graph. The most important features extracted from each image are the finger locations, their heights, and the distance between each pair of fingers. Our approach consists of three steps: (i) hand localization, (ii) finger extraction, and (iii) feature identification and combination for digit recognition. Each input image is assumed to contain a single hand against a black background, so a classifier based on skin color suffices to identify the skin pixels. The finger extraction step removes all hand components except the fingers; this process relies on the anatomical properties of the hand. The final step builds a histogram representation of the detected fingers, from which the features are identified and the digit is recognized. The approach is invariant to scale, rotation, and translation of the hand. Experiments have been undertaken to show the effectiveness of the proposed approach.
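The pipeline described in the abstract can be sketched in simplified form. The following is a minimal illustration, not the authors' implementation: the brightness threshold, the fixed palm/finger cut-off fraction, and the run-counting heuristic over the column histogram are all hypothetical stand-ins for the paper's skin-color classifier, anatomy-based finger extraction, and histogram-based feature identification.

```python
import numpy as np

def skin_mask(image, threshold=40):
    # Hand localization: with a black background, any sufficiently
    # bright pixel is taken as skin (hypothetical threshold).
    return image.mean(axis=2) > threshold

def count_fingers(mask, palm_fraction=0.5):
    # Finger extraction: keep only the upper part of the hand region
    # and discard the palm rows (hypothetical anatomical cut-off).
    rows = np.where(mask.any(axis=1))[0]
    if rows.size == 0:
        return 0
    top, bottom = rows[0], rows[-1]
    cut = top + int((bottom - top + 1) * palm_fraction)
    fingers = mask[top:cut]
    # Feature identification: a column histogram of the finger region;
    # each contiguous run of non-empty columns counts as one finger.
    hist = fingers.any(axis=0).astype(int)
    return int(np.sum(np.diff(np.concatenate(([0], hist))) == 1))
```

Because the features come from the shape of the histogram rather than from absolute pixel coordinates, a scheme of this kind is naturally tolerant to translation of the hand; normalizing by the hand's bounding box, as the `palm_fraction` cut does, also removes scale dependence.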