Recognizing and interpreting gestures on a mobile robot

  • Authors:
  • David Kortenkamp, Eric Huber, R. Peter Bonasso

  • Affiliations:
  • Metrica, Inc., NASA Johnson Space Center, ER2, Houston, TX

  • Venue:
  • AAAI'96: Proceedings of the Thirteenth National Conference on Artificial Intelligence - Volume 2
  • Year:
  • 1996

Abstract

Gesture recognition is an important skill for robots that work closely with humans. Gestures help to clarify spoken commands and are a compact means of relaying geometric information. We have developed a real-time, three-dimensional gesture recognition system that resides on-board a mobile robot. Using a coarse three-dimensional model of a human to guide stereo measurements of body parts, the system is capable of recognizing six distinct gestures made by an unadorned human in an unaltered environment. An active vision approach focuses the vision system's attention on small, moving areas of space to allow for frame rate processing even when the person and/or the robot are moving. This paper describes the gesture recognition system, including the coarse model and the active vision approach. This paper also describes how the gesture recognition system is integrated with an intelligent control architecture to allow for complex gesture interpretation and complex robot action. Results from experiments with an actual mobile robot are given.
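The abstract describes classifying gestures from coarse 3D measurements of body parts. As a rough illustration of the idea (not the authors' implementation), the sketch below classifies a single-arm pose from hypothetical stereo-derived shoulder and hand positions; the gesture labels and angle thresholds are assumptions for illustration, not the paper's six gestures.

```python
import math

# Illustrative sketch only: map a 3D arm pose to a coarse gesture label.
# Points are (x, y, z) in meters, with z as height. Labels and thresholds
# are assumed for this example, not taken from the paper.

def arm_angle(shoulder, hand):
    """Angle of the shoulder->hand vector above horizontal, in degrees."""
    dx = hand[0] - shoulder[0]
    dz = hand[2] - shoulder[2]
    return math.degrees(math.atan2(dz, abs(dx)))

def classify(shoulder, hand):
    """Bin the arm's elevation angle into a symbolic gesture label."""
    a = arm_angle(shoulder, hand)
    if a > 60:
        return "raised"
    if a > 15:
        return "pointing-up"
    if a > -15:
        return "pointing"
    if a > -60:
        return "pointing-down"
    return "relaxed"

# Example: a roughly horizontal arm reads as "pointing".
print(classify((0.0, 0.0, 1.4), (0.6, 0.0, 1.4)))
```

In a system like the one described, such a symbolic label would then be passed to the control architecture for interpretation in the context of the robot's current task.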