Learning robot actions based on self-organising language memory

  • Authors:
  • Stefan Wermter; Mark Elshaw

  • Affiliation:
  • Centre for Hybrid Intelligent Systems, School of Computing and Technology, University of Sunderland, St Peter's Way, Sunderland, SR6 0DD, United Kingdom

  • Venue:
  • Neural Networks - 2003 Special issue: Advances in neural networks research — IJCNN'03
  • Year:
  • 2003

Abstract

In the MirrorBot project we examine perceptual processes using models of cortical assemblies and mirror neurons to explore the emergence of semantic representations of actions, percepts and concepts in a neural robot. The hypothesis under investigation is that a neural model can produce a life-like perception system for actions. In this context, this paper focuses on how instructions for actions can be modelled in a self-organising memory. Current approaches to robot control often make no use of language and ignore neural learning. Our approach, in contrast, uses language instruction and draws on the concepts of regional distributed modularity, self-organisation and neural assemblies. We describe a self-organising model that clusters actions into different locations depending on the body part they are associated with. In particular, we use actual sensor readings from the MIRA robot to represent semantic features of the action verbs. Finally, we outline a hierarchical computational model for a self-organising robot action control system that uses language for instruction.
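The clustering idea in the abstract can be illustrated with a toy Kohonen-style self-organising map. This is a minimal sketch, not the authors' model: the 1-D grid size, the learning schedule, and the three-dimensional "body part activity" feature vectors attached to each verb are all invented for illustration (the paper uses actual MIRA robot sensor readings).

```python
import math
import random

def train_som(data, grid_size=5, epochs=200, lr0=0.5, sigma0=2.0):
    """Train a 1-D self-organising map; returns one weight vector per node."""
    dim = len(data[0])
    random.seed(0)  # deterministic toy run
    weights = [[random.random() for _ in range(dim)] for _ in range(grid_size)]
    for t in range(epochs):
        # learning rate and neighbourhood width both decay over time
        lr = lr0 * math.exp(-t / epochs)
        sigma = sigma0 * math.exp(-t / epochs)
        for x in data:
            # best-matching unit: the node whose weights are closest to x
            bmu = min(range(grid_size),
                      key=lambda i: sum((w - v) ** 2
                                        for w, v in zip(weights[i], x)))
            # pull the BMU and its grid neighbours towards the input
            for i in range(grid_size):
                h = math.exp(-((i - bmu) ** 2) / (2 * sigma ** 2))
                weights[i] = [w + lr * h * (v - w)
                              for w, v in zip(weights[i], x)]
    return weights

def best_unit(weights, x):
    """Index of the map node closest to feature vector x."""
    return min(range(len(weights)),
               key=lambda i: sum((w - v) ** 2 for w, v in zip(weights[i], x)))

# Hypothetical action-verb features: [arm_activity, leg_activity, head_activity]
actions = {
    "pick": [0.9, 0.1, 0.0], "lift": [0.8, 0.2, 0.1],  # hand/arm actions
    "walk": [0.1, 0.9, 0.0], "run":  [0.2, 0.8, 0.1],  # leg actions
    "look": [0.0, 0.1, 0.9], "turn": [0.1, 0.0, 0.8],  # head actions
}
som = train_som(list(actions.values()))
clusters = {verb: best_unit(som, vec) for verb, vec in actions.items()}
```

After training, verbs with similar body-part profiles map to the same or neighbouring grid nodes, which is the sense in which a self-organising memory groups actions by the body part they involve.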