Situated learning of visual robot behaviors

  • Authors:
  • Krishna Kumar Narayanan; Luis-Felipe Posada; Frank Hoffmann; Torsten Bertram

  • Affiliations:
  • Institute of Control Theory and Systems Engineering, Technische Universität Dortmund, Dortmund, Germany (all four authors)

  • Venue:
  • ICIRA'11: Proceedings of the 4th International Conference on Intelligent Robotics and Applications - Volume Part I
  • Year:
  • 2011

Abstract

This paper proposes a new robot learning framework for acquiring scenario-specific autonomous behaviors by demonstration. Visual features are extracted from behavior examples demonstrated in an indoor environment and mapped onto an underlying set of scenario-aware robot behaviors. Demonstrations are performed with an omnidirectional camera, and training instances are registered in different indoor scenarios. The features that distinguish the environment are identified and used to classify the scenario being traversed. Once the scenario is identified, a behavior model trained by means of an artificial neural network for that specific scenario is applied. The generalization ability of the behavior model is evaluated on both seen and unseen data and compared against that of a monolithic general-purpose model. The experimental results on the mobile robot indicate that the acquired behavior is robust and generalizes to meaningful actions beyond the specifics presented during training.
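
To make the two-stage architecture concrete, the following is a minimal Python sketch of a scenario-aware controller in the spirit of the abstract: one classifier identifies the indoor scenario from image features, and a separate per-scenario neural network maps features to motion commands. The class names, feature dimensions, scenario labels, and the use of scikit-learn are illustrative assumptions; the paper does not specify an implementation.

    # Hypothetical sketch of the two-stage, scenario-aware behavior pipeline
    # described in the abstract. All names and dimensions are assumptions.
    import numpy as np
    from sklearn.neural_network import MLPClassifier, MLPRegressor

    class ScenarioAwareController:
        """Classify the current scenario, then query that scenario's behavior model."""

        def __init__(self, scenarios):
            # One classifier decides which indoor scenario (e.g. corridor,
            # doorway, open hall) the robot is traversing, from visual features.
            self.scenario_clf = MLPClassifier(hidden_layer_sizes=(32,), max_iter=2000)
            # One ANN behavior model per scenario maps features to a motion
            # command (e.g. translational and rotational velocity).
            self.behaviors = {s: MLPRegressor(hidden_layer_sizes=(32,), max_iter=2000)
                              for s in scenarios}

        def fit(self, features, scenario_labels, commands):
            # Train the scenario classifier on all demonstrations, then train
            # each behavior model only on the demonstrations from its scenario.
            self.scenario_clf.fit(features, scenario_labels)
            for s, model in self.behaviors.items():
                mask = scenario_labels == s
                model.fit(features[mask], commands[mask])

        def act(self, feature_vector):
            # At run time: classify the scenario, then apply its behavior model.
            s = self.scenario_clf.predict(feature_vector[None, :])[0]
            return self.behaviors[s].predict(feature_vector[None, :])[0]

    # Toy usage with random stand-ins for extracted omnidirectional image features.
    rng = np.random.default_rng(0)
    X = rng.normal(size=(200, 16))        # demonstration feature vectors
    y_scn = rng.integers(0, 2, size=200)  # scenario label per training instance
    y_cmd = rng.normal(size=(200, 2))     # demonstrated (v, omega) commands
    ctrl = ScenarioAwareController(scenarios=[0, 1])
    ctrl.fit(X, y_scn, y_cmd)
    print(ctrl.act(X[0]))

The monolithic general-purpose baseline mentioned in the abstract would correspond to a single regressor trained on all demonstrations regardless of scenario label, which is what the per-scenario decomposition is compared against.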