Universally manipulable body models--dual quaternion representations in layered and dynamic MMCs

  • Authors: Malte Schilling
  • Affiliations: Center of Excellence 'Cognitive Interaction Technology' (CITEC), University of Bielefeld, Bielefeld, Germany 33501 and International Computer Science Institute (ICSI), Berkeley, USA
  • Venue: Autonomous Robots
  • Year: 2011

Abstract

Surprisingly complex tasks can be solved using a behaviour-based, reactive control system, i.e., a system that operates without an explicit internal representation of the environment or of the agent's own body. Nevertheless, internal representations have gained interest in recent years because they can be used to solve problems of perception and motor control (sensor fusion, inverse modeling) and may, in addition, be applied to higher cognitive functions such as the ability to plan ahead. To endow such a system with the ability to find new behavioural solutions to a given problem across a broad range of possibilities, the internal representation must be universally manipulable, i.e., the model should be able to simulate all movements that are physically possible for the given body. Models showing this faculty have been proposed using recurrent neural networks based on the principle of mean of multiple computation (MMC). Extending this approach to three dimensions requires a joint angle representation that allows mean values to be computed. Here we use dual quaternions, which are singularity-free and unambiguous and allow for shortest-path interpolation. In addition, dual quaternions have been shown to be the most efficient and most compact form for representing rigid transformations. The model can easily be adapted to bodies of arbitrary geometry. The extended MMC net introduced in this article represents a holistic system that, following the principle of pattern completion, can likewise be used as an inverse model, as a forward model, for sensor fusion, or for other related capabilities.
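
The abstract does not include code, but the two ingredients it names, unit dual quaternions as a representation of rigid transformations and the computation of mean values over several such estimates, can be illustrated briefly. The following is a minimal Python sketch under those assumptions; the function names, the linear-blend averaging, and the normalization details are illustrative choices, not the network equations from the paper.

```python
# Minimal sketch (not from the paper): unit dual quaternions as rigid
# transformations, plus an MMC-style averaging step over several estimates.
# Quaternions are stored as numpy arrays [w, x, y, z].
import numpy as np

def quat_mul(a, b):
    """Hamilton product of two quaternions [w, x, y, z]."""
    w1, x1, y1, z1 = a
    w2, x2, y2, z2 = b
    return np.array([
        w1*w2 - x1*x2 - y1*y2 - z1*z2,
        w1*x2 + x1*w2 + y1*z2 - z1*y2,
        w1*y2 - x1*z2 + y1*w2 + z1*x2,
        w1*z2 + x1*y2 - y1*x2 + z1*w2,
    ])

def dq_from_rotation_translation(axis, angle, t):
    """Unit dual quaternion (real, dual) for a rotation about `axis` by
    `angle`, followed by a translation `t`."""
    axis = np.asarray(axis, dtype=float)
    axis /= np.linalg.norm(axis)
    real = np.concatenate(([np.cos(angle / 2.0)], np.sin(angle / 2.0) * axis))
    t_quat = np.concatenate(([0.0], t))
    dual = 0.5 * quat_mul(t_quat, real)          # dq = r + eps * (t r) / 2
    return real, dual

def dq_mul(a, b):
    """Composition of two rigid transformations given as dual quaternions."""
    ar, ad = a
    br, bd = b
    return quat_mul(ar, br), quat_mul(ar, bd) + quat_mul(ad, br)

def dq_transform_point(dq, p):
    """Apply a unit dual quaternion to a 3D point."""
    real, dual = dq
    conj_real = real * np.array([1.0, -1.0, -1.0, -1.0])
    t = 2.0 * quat_mul(dual, conj_real)[1:]      # recover translation
    p_quat = np.concatenate(([0.0], p))
    rotated = quat_mul(quat_mul(real, p_quat), conj_real)[1:]
    return rotated + t

def dq_mean(dqs, weights=None):
    """Mean of several dual quaternion estimates: a weighted sum of the
    eight components, renormalized to a unit dual quaternion.  Signs are
    aligned with the first estimate so antipodal representatives of the
    same rotation do not cancel out."""
    if weights is None:
        weights = np.ones(len(dqs)) / len(dqs)
    ref = dqs[0][0]
    real, dual = np.zeros(4), np.zeros(4)
    for (r, d), w in zip(dqs, weights):
        if np.dot(r, ref) < 0.0:                 # choose the same hemisphere
            r, d = -r, -d
        real += w * r
        dual += w * d
    norm = np.linalg.norm(real)
    real, dual = real / norm, dual / norm
    dual -= np.dot(real, dual) * real            # enforce real . dual = 0
    return real, dual

if __name__ == "__main__":
    # Two slightly different estimates of the same joint transformation;
    # the averaging step fuses them into one consistent estimate.
    a = dq_from_rotation_translation([0, 0, 1], 0.50, [1.0, 0.0, 0.0])
    b = dq_from_rotation_translation([0, 0, 1], 0.55, [1.1, 0.0, 0.0])
    mean = dq_mean([a, b])
    print(dq_transform_point(mean, np.array([0.0, 1.0, 0.0])))
```

Averaging the eight components and renormalizing is one simple way to obtain a mean of dual quaternion estimates; because unit quaternions are not closed under addition, the renormalization and the sign alignment are what keep the result a valid rigid transformation close to the shortest-path interpolant for nearby inputs.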