Robot programming by demonstration

  • Authors:
  • Krishna Kumar Narayanan; Luis Felipe Posada; Frank Hoffmann; Torsten Bertram

  • Affiliations:
  • Institute of Control Theory and Systems Engineering, Technische Universität Dortmund, Dortmund, Germany (all authors)

  • Venue:
  • SIMPAR'10: Proceedings of the Second International Conference on Simulation, Modeling, and Programming for Autonomous Robots
  • Year:
  • 2010


Abstract

The manual design of vision-based robotic behaviors remains a substantial challenge due to the complexity of visual information. Considerable empirical and programming effort is required to establish meaningful relationships and constraints between the acquired image perception and the geometry of the environment. This contribution proposes an alternative framework for learning autonomous visual navigation behavior from demonstration examples by integrating a 3D range sensor with an omnidirectional camera. A programming-by-demonstration approach learns the demonstrated trajectories as a mapping from visual features computed on the omnidirectional image to the corresponding robot motion. Exhaustive tests identify the discriminative features required to mimic the teacher demonstrations. The relationship between perception and action is learned from the demonstrations by means of locally weighted regression and artificial neural networks. Experimental results on the mobile robot indicate that the acquired visual behavior is robust and generalizes to environments not presented during training.
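As a rough illustration of the perception-to-action mapping described in the abstract, the sketch below shows locally weighted regression predicting a motion command from a visual feature vector. The feature dimensionality, the Gaussian kernel bandwidth, and the synthetic data are hypothetical assumptions for illustration only, not details taken from the paper.

```python
import numpy as np

def lwr_predict(X, Y, x_query, bandwidth=0.5):
    """Locally weighted linear regression: predict an output for x_query
    from demonstration pairs (X, Y) using a Gaussian distance kernel."""
    # Gaussian weights: demonstrations close to the query dominate
    d2 = np.sum((X - x_query) ** 2, axis=1)
    w = np.exp(-d2 / (2.0 * bandwidth ** 2))
    # Augment inputs with a bias term for the local linear model
    Xa = np.hstack([X, np.ones((X.shape[0], 1))])
    W = np.diag(w)
    # Weighted least squares: beta = (Xa^T W Xa)^{-1} Xa^T W Y
    # (small ridge term added for numerical stability)
    beta = np.linalg.solve(Xa.T @ W @ Xa + 1e-8 * np.eye(Xa.shape[1]),
                           Xa.T @ W @ Y)
    return np.append(x_query, 1.0) @ beta

# Hypothetical data: 2-D visual features mapped to (v, omega) commands
rng = np.random.default_rng(0)
X = rng.uniform(-1, 1, size=(100, 2))
Y = np.column_stack([0.5 + 0.2 * X[:, 0],   # translational velocity
                     1.5 * X[:, 1]])        # angular velocity
pred = lwr_predict(X, Y, np.array([0.1, -0.3]))
```

Because the local model is refit around each query, such a scheme can reproduce nonlinear perception-to-action mappings from demonstrations without committing to a single global model; the neural-network variant mentioned in the abstract would instead fit one global mapping.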