Generality and legibility in mobile manipulation

  • Authors:
  • Michael Beetz; Freek Stulp; Piotr Esden-Tempski; Andreas Fedrizzi; Ulrich Klank; Ingo Kresse; Alexis Maldonado; Federico Ruiz

  • Affiliations:
  • Intelligent Autonomous Systems Group, Department of Informatics, Technische Universität München, Garching bei München, Germany 85748 (all authors)

  • Venue:
  • Autonomous Robots
  • Year:
  • 2010

Abstract

This article investigates methods for achieving more general manipulation capabilities for mobile manipulation platforms that produce legible behavior in human living environments. To achieve generality and legibility, we combine two control mechanisms. First, experience- and observation-based learning of skills is applied to routine tasks, exploiting the repetitive and stereotypical character of everyday activity. Second, planning, reasoning, and search are used for novel tasks that have no stereotypical solution. We apply these ideas to the learning and use of action-related places, to the model-based visual recognition and localization of objects, and to the learning and application of reaching strategies and motions from humans. We demonstrate the integration of these mechanisms into a single low-level control system for autonomous manipulation platforms.
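The two control mechanisms described in the abstract can be pictured as a simple dispatch: routine tasks are handled by skills learned from experience, while novel tasks fall back to planning and search. The sketch below is purely illustrative; all names (`SkillLibrary`, `plan_task`, `execute`) are hypothetical stand-ins, not APIs from the paper.

```python
# Hypothetical sketch of the hybrid control idea: exploit learned,
# stereotypical skills for routine tasks; fall back to planning,
# reasoning, and search for novel tasks.

class SkillLibrary:
    """Maps routine task names to skills learned from experience/observation."""

    def __init__(self):
        self._skills = {}

    def learn(self, task, skill):
        # Record a skill acquired for a recurring, stereotypical task.
        self._skills[task] = skill

    def lookup(self, task):
        return self._skills.get(task)


def plan_task(task):
    """Stand-in for the planning/reasoning/search fallback for novel tasks."""
    return f"planned solution for '{task}'"


def execute(task, library):
    """Prefer a learned routine skill; otherwise search for a novel solution."""
    skill = library.lookup(task)
    if skill is not None:
        return skill(task)   # routine task: reuse learned behavior
    return plan_task(task)   # novel task: no stereotypical solution exists


library = SkillLibrary()
library.learn("pick up cup", lambda t: f"learned skill for '{t}'")

print(execute("pick up cup", library))      # routine -> learned skill
print(execute("open new drawer", library))  # novel   -> planner
```

The design point is that both paths feed the same executor, mirroring the article's integration of learning- and planning-based control into a single low-level control system.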