Automated cross-modal mapping in robotic eye/hand systems using plastic radial basis function networks

  • Authors:
  • Qinggang Meng; M. H. Lee

  • Affiliations:
  • Department of Computer Science, Loughborough University, UK; Department of Computer Science, University of Wales, Aberystwyth, UK

  • Venue:
  • Connection Science
  • Year:
  • 2007

Abstract

Advanced autonomous artificial systems will need incremental learning and adaptive abilities similar to those seen in humans. Knowledge from biology, psychology and neuroscience is now inspiring new approaches for systems that have sensory-motor capabilities and operate in complex environments. Eye/hand coordination is an important cross-modal cognitive function, and is also typical of many of the other coordinations that must be involved in the control and operation of embodied intelligent systems. This paper examines a biologically inspired approach for incrementally constructing compact mapping networks for eye/hand coordination. We present a simplified node-decoupled extended Kalman filter for radial basis function networks, and compare this with other learning algorithms. An experimental system consisting of a robot arm and a pan-and-tilt head with a colour camera is used to test the algorithms and produce the results in this paper. We also present three approaches for adapting to structural changes during eye/hand coordination tasks, and the robustness of the algorithms under noise is investigated. The learning and adaptation approaches in this paper have similarities with current ideas about neural growth in the brains of humans and animals during tool use, and in infants during early cognitive development.
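To make the mapping idea concrete, the sketch below shows a minimal Gaussian radial basis function network that maps 2-D gaze coordinates (pan, tilt) to 2-D hand-position targets. This is an illustrative toy only: the synthetic data, fixed centres, shared width, and batch least-squares fit are assumptions for the example, not the paper's incremental construction or its simplified node-decoupled extended Kalman filter training.

```python
import numpy as np

def rbf_design(X, centres, width):
    """Gaussian activations: phi[i, j] = exp(-||x_i - c_j||^2 / (2 * width^2))."""
    d2 = ((X[:, None, :] - centres[None, :, :]) ** 2).sum(axis=2)
    return np.exp(-d2 / (2.0 * width ** 2))

rng = np.random.default_rng(0)

# Toy training set: gaze samples (pan, tilt) and hand-position targets.
# The smooth target function is an illustrative stand-in for the real
# eye-to-hand mapping learned on the robot.
X = rng.uniform(-1.0, 1.0, size=(200, 2))
Y = np.column_stack([np.sin(X[:, 0]), X[:, 1]])

# Fixed hidden-layer centres and a shared width (hypothetical choices).
centres = rng.uniform(-1.0, 1.0, size=(25, 2))
width = 0.5

# Fit output weights by linear least squares on the RBF activations.
Phi = rbf_design(X, centres, width)
W, *_ = np.linalg.lstsq(Phi, Y, rcond=None)

# Evaluate fit quality on the training data.
pred = rbf_design(X, centres, width) @ W
rmse = float(np.sqrt(((pred - Y) ** 2).mean()))
```

In the paper's setting the hidden layer would instead grow node by node as new sensory-motor experience arrives, with the weights (and optionally centres and widths) updated online rather than refit in batch.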