Combining visual attention, object recognition and associative information processing in a neurobotic system

  • Authors:
  • Rebecca Fay; Ulrich Kaufmann; Andreas Knoblauch; Heiner Markert; Günther Palm

  • Affiliations:
  • Department of Neural Information Processing, University of Ulm, Ulm, Germany (all authors)

  • Venue:
  • Biomimetic Neural Learning for Intelligent Robots
  • Year:
  • 2005

Abstract

We have implemented a neurobiologically plausible system on a robot that integrates visual attention, object recognition, language and action processing using a coherent cortex-like architecture based on neural associative memories. This system enables the robot to respond to spoken commands such as "bot show plum" or "bot put apple to yellow cup". The scenario is a robot positioned near one or two tables that carry various kinds of fruit and other simple objects. The system can demonstrate tasks such as finding and pointing to particular fruits in a complex visual scene in response to spoken or typed commands. This involves parsing and understanding simple sentences, relating the nouns to concrete objects sensed by the camera, and coordinating motor output with planning and sensory processing.
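
To illustrate the kind of building block the abstract refers to, the sketch below implements a Willshaw-type binary associative memory, the classical model behind Palm-style neural associative memories, associating a sparse "word" pattern with a sparse "object" pattern. The class name, pattern sizes, and the word/object pairing are illustrative assumptions for this sketch, not the paper's actual representations or code.

```python
# Minimal sketch of a binary (Willshaw-style) associative memory.
# Pattern sizes and the word/object pairing are illustrative assumptions.
import numpy as np

class BinaryAssociativeMemory:
    def __init__(self, input_size, output_size):
        # Binary weight matrix; a synapse is switched on once any stored
        # pattern pair activates both its pre- and postsynaptic unit.
        self.W = np.zeros((input_size, output_size), dtype=bool)

    def store(self, x, y):
        # Clipped Hebbian learning: OR in the outer product of the pair.
        self.W |= np.outer(x.astype(bool), y.astype(bool))

    def retrieve(self, x):
        # Dendritic sums followed by a threshold equal to the number of
        # active input units (Willshaw threshold).
        sums = self.W.T @ x.astype(int)
        return (sums >= x.sum()).astype(int)

# Toy usage: associate a sparse "word" pattern with a sparse "object" pattern.
rng = np.random.default_rng(0)

def sparse_pattern(size, k):
    p = np.zeros(size, dtype=int)
    p[rng.choice(size, k, replace=False)] = 1
    return p

word_plum, obj_plum = sparse_pattern(100, 5), sparse_pattern(100, 5)
mem = BinaryAssociativeMemory(100, 100)
mem.store(word_plum, obj_plum)
assert np.array_equal(mem.retrieve(word_plum), obj_plum)
```

Sparse binary patterns and clipped binary synapses keep storage and retrieval cheap while allowing pattern completion from partial cues, which is what makes such memories attractive for linking word representations to visually sensed objects in an architecture like the one described above.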