Gesture-based interaction with a pet robot

  • Authors:
  • Milyn C. Moy

  • Affiliations:
  • -

  • Venue:
  • AAAI '99/IAAI '99: Proceedings of the sixteenth national conference on Artificial intelligence and the eleventh Innovative applications of artificial intelligence conference

  • Year:
  • 1999

Abstract

Pet robots are autonomous robots capable of exhibiting animal-like behaviors, including emotional ones, as they interact with the people and objects around them. As pet robots become more integrated into our lives, a more natural way of communicating with them will become necessary. In particular, they will need to understand human gestures in order to perceive our intentions and communicate with us more effectively. In this paper, we present an extensible, real-time, vision-based communication system that interprets 2D dynamic hand gestures in complex environments. Our strategy for interpreting hand gestures consists of three stages: hand segmentation, feature extraction, and gesture recognition. To segment the hand from the cluttered background, the system uses both motion and color information. The location of the hand is then tracked as the user makes the gesture, and its trajectory is stored in a feature vector. Finally, the gesture is interpreted from this vector and translated into a command that the robot understands. We implemented our system on Yuppy, a pet robot prototype. Currently, using an external microcamera, we can navigate Yuppy through unstructured environments with hand gestures.
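The abstract gives no implementation details, but the three-stage pipeline it describes can be illustrated in code. The sketch below is a hypothetical reconstruction using OpenCV, not the paper's actual system: it fuses a frame-difference motion mask with a skin-color mask to segment the hand, tracks the blob centroid into a trajectory, and maps the dominant motion direction to one of four robot commands. The HSV skin range, thresholds, and command names are all assumptions introduced for illustration.

```python
# Illustrative sketch of the pipeline described in the abstract
# (hand segmentation -> feature extraction -> gesture recognition).
# All thresholds, the HSV skin range, and the command set are assumptions.
import cv2
import numpy as np

# Hypothetical skin-color bounds in HSV space; the paper's actual
# color model is not specified in the abstract.
SKIN_LOW = np.array([0, 40, 60], dtype=np.uint8)
SKIN_HIGH = np.array([25, 180, 255], dtype=np.uint8)

def segment_hand(frame, prev_gray):
    """Return a binary mask where motion and skin color agree."""
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    # Motion cue: difference against the previous frame.
    motion = cv2.absdiff(gray, prev_gray)
    _, motion_mask = cv2.threshold(motion, 20, 255, cv2.THRESH_BINARY)
    # Color cue: pixels inside the assumed skin-color range.
    hsv = cv2.cvtColor(frame, cv2.COLOR_BGR2HSV)
    color_mask = cv2.inRange(hsv, SKIN_LOW, SKIN_HIGH)
    # Require both cues, then remove small speckles.
    mask = cv2.bitwise_and(motion_mask, color_mask)
    mask = cv2.morphologyEx(mask, cv2.MORPH_OPEN, np.ones((5, 5), np.uint8))
    return mask, gray

def hand_centroid(mask):
    """Centroid of the largest blob, or None if no hand is found."""
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)
    if not contours:
        return None
    largest = max(contours, key=cv2.contourArea)
    m = cv2.moments(largest)
    if m["m00"] == 0:
        return None
    return (m["m10"] / m["m00"], m["m01"] / m["m00"])

def classify_gesture(trajectory):
    """Toy recognizer: map the dominant motion direction to a command."""
    if len(trajectory) < 2:
        return "stop"
    dx = trajectory[-1][0] - trajectory[0][0]
    dy = trajectory[-1][1] - trajectory[0][1]
    if abs(dx) > abs(dy):
        return "turn_right" if dx > 0 else "turn_left"
    return "move_backward" if dy > 0 else "move_forward"
```

In a capture loop, segment_hand would run on each frame, the points returned by hand_centroid would be appended to the trajectory while the hand is visible, and classify_gesture would fire once the hand leaves the frame; the resulting command string would then be dispatched to the robot's navigation layer. Combining the two cues is what makes segmentation plausible against cluttered backgrounds: skin-colored static objects fail the motion test, and moving non-skin objects fail the color test.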