Robot vision
Specifying gestures by example
Proceedings of the 18th annual conference on Computer graphics and interactive techniques
Charade: remote control of objects using free-hand gestures
Communications of the ACM - Special issue on computer augmented environments: back to the real world
A hand gesture interface device
CHI '87 Proceedings of the SIGCHI/GI Conference on Human Factors in Computing Systems and Graphics Interface
Hidden Markov Models for Speech Recognition
Visual Tracking of High DOF Articulated Structures: an Application to Human Hand Tracking
ECCV '94 Proceedings of the Third European Conference on Computer Vision - Volume II
Hand Tension as a Gesture Segmentation Cue
Proceedings of Gesture Workshop on Progress in Gestural Interaction
Gesture recognition using the Perseus architecture
CVPR '96 Proceedings of the 1996 Conference on Computer Vision and Pattern Recognition (CVPR '96)
Real-Time Hand-Arm Motion Analysis using a single Video Camera
FG '96 Proceedings of the 2nd International Conference on Automatic Face and Gesture Recognition (FG '96)
Toward Robust Skin Identification in Video Images
FG '96 Proceedings of the 2nd International Conference on Automatic Face and Gesture Recognition (FG '96)
Towards 3D hand tracking using a deformable model
FG '96 Proceedings of the 2nd International Conference on Automatic Face and Gesture Recognition (FG '96)
Recognizing and interpreting gestures on a mobile robot
AAAI'96 Proceedings of the thirteenth national conference on Artificial intelligence - Volume 2
Interacting with a pet robot using hand gestures
AAAI '99/IAAI '99 Proceedings of the sixteenth national conference on Artificial intelligence and the eleventh Innovative applications of artificial intelligence conference
Application of Markerless Image-Based Arm Tracking to Robot-Manipulator Teleoperation
CRV '04 Proceedings of the 1st Canadian Conference on Computer and Robot Vision
HMM-Based gesture recognition for robot control
IbPRIA'05 Proceedings of the Second Iberian conference on Pattern Recognition and Image Analysis - Volume Part I
Pet robots are autonomous robots capable of exhibiting animal-like behaviors, including emotional ones, as they interact with the people and objects around them. As pet robots become more integrated into our lives, a more natural way of communicating with them will become necessary. In particular, they will need to understand human gestures in order to perceive our intentions and communicate with us more effectively. In this paper, we present an extensible, real-time, vision-based communication system that interprets 2D dynamic hand gestures in complex environments. Our strategy for interpreting hand gestures consists of three stages: hand segmentation, feature extraction, and gesture recognition. To segment the hand from the cluttered background, the system uses both motion and color information. The location of the hand is then tracked as the user makes the gesture, and its trajectory is stored in a feature vector. Finally, the gesture is interpreted from this vector and translated into a command that the robot understands. We implemented our system on Yuppy, a pet robot prototype. Currently, using an external micro-camera, we can navigate Yuppy through unstructured environments with hand gestures.
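The segmentation-and-tracking pipeline described in the abstract (combine a skin-color cue with a motion cue, locate the hand, and accumulate its trajectory as a feature vector) can be sketched as follows. This is a minimal illustration, not the paper's implementation: the crude RGB skin rule, the motion threshold, and all function names are assumptions introduced here.

```python
import numpy as np

# Assumed frame-difference threshold; the paper's actual parameters are not given.
MOTION_THRESH = 25

def skin_mask(frame):
    """Boolean mask of pixels whose RGB values fall in a crude skin range
    (an illustrative rule, not the paper's color model)."""
    r, g, b = frame[..., 0], frame[..., 1], frame[..., 2]
    return (r > 95) & (g > 40) & (b > 20) & (r > g) & (r > b)

def motion_mask(frame, prev_frame):
    """Boolean mask of pixels that changed noticeably between frames."""
    diff = np.abs(frame.astype(int) - prev_frame.astype(int)).max(axis=-1)
    return diff > MOTION_THRESH

def hand_centroid(frame, prev_frame):
    """Combine the color and motion cues; return the (row, col) centroid
    of the candidate hand region, or None if nothing qualifies."""
    mask = skin_mask(frame) & motion_mask(frame, prev_frame)
    ys, xs = np.nonzero(mask)
    if len(ys) == 0:
        return None
    return (ys.mean(), xs.mean())

def track(frames):
    """The trajectory feature vector: the sequence of hand centroids over time."""
    trajectory = []
    for prev, cur in zip(frames, frames[1:]):
        c = hand_centroid(cur, prev)
        if c is not None:
            trajectory.append(c)
    return trajectory
```

The resulting trajectory would then be fed to the recognition stage (for example, a per-gesture classifier) to map it to a robot command.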