Learning to interpret pointing gestures: experiments with four-legged autonomous robots

  • Authors:
  • Verena V. Hafner; Frédéric Kaplan

  • Affiliations:
  • Sony CSL Paris, Paris, France; Sony CSL Paris, Paris, France

  • Venue:
  • Biomimetic Neural Learning for Intelligent Robots
  • Year:
  • 2005

Abstract

In order to bootstrap shared communication systems, robots must have a non-verbal way to influence the attention of one another. This chapter presents an experiment in which a robot learns to interpret pointing gestures of another robot. We show that simple feature-based neural learning techniques permit reliably to discriminate between left and right pointing gestures. This is a first step towards more complex attention coordination behaviour. We discuss the results of this experiment in relation to possible developmental scenarios about how children learn to interpret pointing gestures.