Vision-Motor Abstraction toward Robot Cognition

  • Authors:
  • Fady Alnajjar; Abdul Rahman Hafiz; Indra Bin Zin; Kazuyuki Murase

  • Affiliations:
  • Graduate School of Engineering, University of Fukui, Japan (all authors)

  • Venue:
  • ICONIP '09 Proceedings of the 16th International Conference on Neural Information Processing: Part II
  • Year:
  • 2009


Abstract

Based on indications from neuroscience and psychology, both perception and action can be internally simulated in organisms by activating sensory and/or motor areas in the brain without actual sensory input and/or without any resulting behavior. Organisms typically use this phenomenon to cope with missing external inputs. Applying such a phenomenon in a real robot has recently attracted the attention of many researchers. Although some work has been reported on this issue, none of it has so far considered the potential of the robot's vision at the sensorimotor abstraction level, where data are extracted from the environment to build the internal representation. In this study, a novel vision-motor abstraction is implemented in a physical robot through a memory-based learning algorithm. Experimental results indicate that our robot, using its vision, could develop a simple anticipation mechanism in its memory through interaction with the environment. This mechanism could guide the robot's behavior in the absence of external inputs.
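The abstract's core idea, a memory-based anticipation mechanism that substitutes internally simulated percepts when external input is missing, can be sketched roughly as follows. This is an illustrative assumption, not the authors' actual algorithm: class and method names (`SensorimotorMemory`, `observe`, `step`) and the table-lookup memory are hypothetical simplifications of whatever representation the paper uses.

```python
class SensorimotorMemory:
    """Hypothetical sketch: the robot stores observed
    (sensor, motor) -> next-sensor transitions during real interaction,
    and when external input is missing it replays the stored prediction
    as an internally simulated percept to keep guiding behavior."""

    def __init__(self):
        # transition memory: (sensor, motor) -> predicted next sensor
        self.memory = {}
        # last real or internally simulated percept
        self.internal_state = None

    def observe(self, sensor, motor, next_sensor):
        """Learn one transition from real interaction with the environment."""
        self.memory[(sensor, motor)] = next_sensor
        self.internal_state = next_sensor

    def step(self, sensor, motor):
        """Advance one step; if `sensor` is None (external input missing),
        fall back on the internally simulated percept."""
        if sensor is None:
            sensor = self.internal_state  # internal simulation
        predicted = self.memory.get((sensor, motor))
        if predicted is not None:
            self.internal_state = predicted
        return predicted
```

For example, after training on a wall-following sequence, calling `step(None, "turn_right")` would let the stored memory stand in for the missing camera input, which is the anticipation behavior the abstract describes.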