Space Perception through Visuokinesthetic Prediction

  • Authors: Wolfram Schenck
  • Affiliation: Computer Engineering Group, Faculty of Technology, Bielefeld University, Bielefeld, Germany
  • Venue: Anticipatory Behavior in Adaptive Learning Systems
  • Year: 2009

Abstract

A model of visual space perception within the framework of the "perception through anticipation" approach is proposed. In this model, objects are localized by generating a simulated sequence of motor commands which would move the end effector of the agent from its current location to a location where it touches the object. Space perception arises whenever the agent knows how to move to the object. The main components of the model are a visuokinesthetic forward model for sensory prediction and a visual memory for novelty detection. Movement sequences are generated by the optimization method "differential evolution". The approach was implemented and successfully tested on a robot arm setup in the domain of block pushing on a table surface. The results indicate that visuokinesthetic prediction is superior to purely visual prediction for an iterative internal simulation of future sensory states. Furthermore, it is demonstrated that the generated movement sequences encode the location of the target object in a straightforward way.
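The abstract describes generating movement sequences with the optimization method "differential evolution", where each candidate is a sequence of motor commands evaluated by iteratively rolling it out through a forward model. The sketch below illustrates that idea under stated assumptions: the plant (`rollout`) is a toy stand-in for the paper's learned visuokinesthetic forward model, and the start/target positions, sequence length, and DE settings (`NP`, `F`, `CR`) are all hypothetical choices for illustration, not values from the paper.

```python
import numpy as np

# Illustrative sketch (not the paper's implementation): classic DE/rand/1/bin
# searching for a sequence of 2D motor commands that drives a simulated
# end effector from a start position to a target ("touching the object").

rng = np.random.default_rng(0)

START = np.array([0.0, 0.0])      # hypothetical effector start position
TARGET = np.array([1.0, -0.5])    # hypothetical object location
STEPS = 5                         # length of the motor-command sequence
DIM = STEPS * 2                   # each command is a 2D displacement

def rollout(commands):
    """Toy forward model: apply each (dx, dy) command in sequence,
    attenuated by 0.9 to mimic an imperfect plant."""
    pos = START.copy()
    for dx, dy in commands.reshape(STEPS, 2):
        pos = pos + 0.9 * np.array([dx, dy])
    return pos

def cost(commands):
    """Distance of the predicted final position from the target."""
    return np.linalg.norm(rollout(commands) - TARGET)

# Differential evolution loop
NP, F, CR = 30, 0.7, 0.9          # population size, scale factor, crossover rate
pop = rng.uniform(-0.5, 0.5, size=(NP, DIM))
fit = np.array([cost(ind) for ind in pop])

for gen in range(200):
    for i in range(NP):
        # Mutation: combine three distinct individuals other than i
        idx = rng.choice([j for j in range(NP) if j != i], 3, replace=False)
        a, b, c = pop[idx]
        mutant = a + F * (b - c)
        # Binomial crossover with at least one mutant component
        cross = rng.random(DIM) < CR
        cross[rng.integers(DIM)] = True
        trial = np.where(cross, mutant, pop[i])
        # Greedy selection
        f = cost(trial)
        if f <= fit[i]:
            pop[i], fit[i] = trial, f

best = pop[np.argmin(fit)]
print(f"final distance to target: {fit.min():.4f}")
```

As in the model, the optimized command sequence itself then encodes the object's location: executing it (here, summing the attenuated displacements) moves the effector to the target.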