From Biologically Realistic Imitation to Robot Teaching Via Human Motor Learning

  • Authors:
  • Erhan Oztop; Jan Babic; Joshua Hale; Gordon Cheng; Mitsuo Kawato

  • Affiliations:
  • Erhan Oztop: JST, ICORP, Computational Brain Project, Saitama, Japan and ATR Computational Neuroscience Laboratories, Kyoto, Japan 619-0288
  • Jan Babic: ATR Computational Neuroscience Laboratories, Kyoto, Japan 619-0288 and Department of Automation, Biocybernetics and Robotics, Jozef Stefan Institute, Ljubljana, Slovenia 1000
  • Joshua Hale; Gordon Cheng; Mitsuo Kawato: JST, ICORP, Computational Brain Project, Saitama, Japan and ATR Computational Neuroscience Laboratories, Kyoto, Japan 619-0288

  • Venue:
  • Neural Information Processing
  • Year:
  • 2008


Abstract

Understanding the mechanisms of imitation is a complex task in both the human sciences and robotics. On one hand, one can use engineering techniques to build systems that analyze observed motion, map it onto their own body, and produce the motor commands needed to reproduce the inferred motion. On the other hand, one can model the neural circuits involved in action observation and production in minute detail and hope that imitation emerges as a property of the system. However, if the goal is to build robots capable of skillful actions, midway solutions appear more appropriate. In this direction, we first introduce a biologically realistic neural network that can learn to imitate hand postures, either with the help of a teacher or through self-observation. We then move to a paradigm we have recently proposed, in which robot skill synthesis is achieved by exploiting the human capacity to learn novel control tasks.