Natural demonstration of manipulation skills for multimodal interactive robots

  • Authors:
  • Markus Hüser; Tim Baier-Löwenstein; Marina Svagusa; Jianwei Zhang

  • Affiliations:
  • University of Hamburg, Faculty of Mathematics, Informatics and Natural Sciences, Department Informatics, Group TAMS (all authors)

  • Venue:
  • UAHCI'07: Proceedings of the 4th International Conference on Universal Access in Human-Computer Interaction: Ambient Interaction
  • Year:
  • 2007

Abstract

This paper presents a novel approach to the natural demonstration of manipulation skills, especially grasping skills, for multimodal interactive robots. To teach grasping skills to such a robot, a human instructor uses natural spoken language together with grasping actions demonstrated to the robot. The proposed approach addresses four aspects of learning by demonstration. First, a dialog system for processing natural speech is presented. Second, an object detection and classification scheme for the robot is described. Third, the correspondence problem is addressed by an algorithm that visually tracks the demonstrator's hands in real time and transforms the tracking results into an approach trajectory for a robotic arm. Fourth, the robot's hand configuration is fine-tuned for each grasp, using an introduced criterion that evaluates a grasp for stability and for possible reuse of the grasped object. The approach produces stable grasps and is applied and evaluated on a multimodal service robot.