A Gesture Based Interface for Human-Robot Interaction

  • Authors:
  • Stefan Waldherr; Roseli Romero; Sebastian Thrun

  • Affiliations:
  • Computer Science Department, Carnegie Mellon University, Pittsburgh, PA, USA; Instituto de Ciências Matemáticas e de Computação, Universidade de São Paulo, São Carlos, SP, Brazil; Computer Science Department, Carnegie Mellon University, Pittsburgh, PA, USA

  • Venue:
  • Autonomous Robots
  • Year:
  • 2000

Abstract

Service robotics is currently a highly active research area in robotics, with enormous societal potential. Since service robots directly interact with people, finding “natural” and easy-to-use user interfaces is of fundamental importance. While past work has predominantly focused on issues such as navigation and manipulation, relatively few robotic systems are equipped with flexible user interfaces that permit controlling the robot by “natural” means. This paper describes a gesture interface for the control of a mobile robot equipped with a manipulator. The interface uses a camera to track a person and recognize gestures involving arm motion. A fast, adaptive tracking algorithm enables the robot to track and follow a person reliably through office environments with changing lighting conditions. Two alternative methods for gesture recognition are compared: a template-based approach and a neural network approach. Both are combined with the Viterbi algorithm for the recognition of gestures defined through arm motion (in addition to static arm poses). Results are reported in the context of an interactive clean-up task, where a person guides the robot to specific locations that need to be cleaned and instructs the robot to pick up trash.
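The abstract's use of the Viterbi algorithm for motion gestures can be illustrated with a minimal sketch: a hidden Markov model whose hidden states are gesture phases and whose observations are per-frame arm-pose labels. All states, probabilities, and the toy observation sequence below are illustrative assumptions, not the paper's actual model or parameters.

```python
def viterbi(obs, states, start_p, trans_p, emit_p):
    """Return the most likely hidden state sequence for the observations."""
    # V[t][s] = (probability of best path ending in state s at time t, predecessor state)
    V = [{s: (start_p[s] * emit_p[s][obs[0]], None) for s in states}]
    for t in range(1, len(obs)):
        V.append({})
        for s in states:
            prob, prev = max(
                (V[t - 1][p][0] * trans_p[p][s] * emit_p[s][obs[t]], p)
                for p in states
            )
            V[t][s] = (prob, prev)
    # Backtrack from the most probable final state.
    last = max(states, key=lambda s: V[-1][s][0])
    path = [last]
    for t in range(len(obs) - 1, 0, -1):
        path.append(V[t][path[-1]][1])
    return list(reversed(path))


# Toy example (hypothetical): two gesture phases observed via frame-level
# pose labels produced by some per-frame classifier.
states = ["rest", "wave"]
start_p = {"rest": 0.8, "wave": 0.2}
trans_p = {"rest": {"rest": 0.7, "wave": 0.3},
           "wave": {"rest": 0.3, "wave": 0.7}}
emit_p = {"rest": {"down": 0.8, "up": 0.2},
          "wave": {"down": 0.2, "up": 0.8}}

frames = ["down", "down", "up", "up", "up"]
print(viterbi(frames, states, start_p, trans_p, emit_p))
# → ['rest', 'rest', 'wave', 'wave', 'wave']
```

Decoding over the whole frame sequence, rather than classifying frames independently, is what lets the transition probabilities smooth over noisy per-frame pose estimates, which is the reason a Viterbi-style layer is useful for gestures defined by arm *motion*.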