Incremental Learning of Tasks From User Demonstrations, Past Experiences, and Vocal Comments

  • Authors:
  • M. Pardowitz; S. Knoop; R. Dillmann; R. D. Zollner

  • Affiliations:
  • Inst. of Comput. Sci. & Eng., Karlsruhe Univ.

  • Venue:
  • IEEE Transactions on Systems, Man, and Cybernetics, Part B: Cybernetics
  • Year:
  • 2007

Abstract

For many years, the robotics community has envisioned robot assistants sharing the same environment with humans. It has become clear that such assistants must interact with humans and adapt to individual user needs. In particular, the wide variety of tasks robot assistants will face requires a highly adaptive and user-friendly programming interface. One possible solution to this programming problem is the learning-by-demonstration paradigm, in which the robot observes the execution of a task, acquires task knowledge, and reproduces it. In this paper, a system to record, interpret, and reason over demonstrations of household tasks is presented. The focus is on the model-based representation of manipulation tasks, which serves as a basis for incremental reasoning over the acquired task knowledge. The aim of this reasoning is to condense and interconnect the data, resulting in more general task knowledge. A measure for assessing the information content of task features is introduced. This relevance measure relies on both general background knowledge and task-specific knowledge gathered from the user demonstrations. Besides the autonomous estimation of feature information content, speech comments given during execution that point out the relevance of features are considered as well. The incremental growth of task knowledge as more demonstrations become available, and its fusion with relevance information gained from speech comments, is demonstrated on the task of laying a table.
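
The abstract does not spell out the relevance measure itself; the Python sketch below only illustrates the general idea it describes, fusing evidence from repeated demonstrations with general background knowledge and boosts from vocal comments. All function names, feature labels, and weights are illustrative assumptions, not the authors' implementation.

```python
# Minimal sketch (assumed names and weights): estimate feature relevance by
# combining how consistently a feature appears across demonstrations with a
# background-knowledge prior and an optional boost from vocal comments.
from collections import Counter
from typing import Iterable

def feature_relevance(
    demonstrations: Iterable[set[str]],   # features observed in each demonstration
    background_prior: dict[str, float],   # general background-knowledge weights
    speech_hints: set[str],               # features pointed out in vocal comments
    speech_boost: float = 0.5,            # assumed weight of a vocal comment
) -> dict[str, float]:
    demos = list(demonstrations)
    counts = Counter(f for demo in demos for f in demo)
    relevance = {}
    for feature, count in counts.items():
        consistency = count / len(demos)            # task-specific evidence
        prior = background_prior.get(feature, 0.1)  # general knowledge
        boost = speech_boost if feature in speech_hints else 0.0
        relevance[feature] = consistency * prior + boost
    return relevance

if __name__ == "__main__":
    # Toy "laying a table" demonstrations with hypothetical feature labels.
    demos = [
        {"plate_on_table", "fork_left_of_plate"},
        {"plate_on_table", "fork_left_of_plate", "napkin_present"},
        {"plate_on_table"},
    ]
    prior = {"plate_on_table": 1.0, "fork_left_of_plate": 0.8, "napkin_present": 0.3}
    print(feature_relevance(demos, prior, speech_hints={"fork_left_of_plate"}))
```

In this toy run, a feature observed in every demonstration (plate on the table) scores highest from demonstration evidence alone, while a less consistent feature mentioned in a vocal comment (fork left of the plate) is raised by the speech boost, mirroring the fusion of autonomous estimation and user comments described in the abstract.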