Modelling mental rotation in cognitive robots

  • Authors:
  • Kristsana Seepanomwan; Daniele Caligiore; Gianluca Baldassarre; Angelo Cangelosi

  • Affiliations:
  • Plymouth University, Plymouth, UK; Laboratory of Computational Embodied Neuroscience, Istituto di Scienze e Tecnologie della Cognizione, Consiglio Nazionale delle Ricerche (LOCEN-ISTC-CNR), Rome, Italy; Laboratory of Computational Embodied Neuroscience, Istituto di Scienze e Tecnologie della Cognizione, Consiglio Nazionale delle Ricerche (LOCEN-ISTC-CNR), Rome, Italy; Plymouth University, Plymouth, UK

  • Venue:
  • Adaptive Behavior - Animals, Animats, Software Agents, Robots, Adaptive Systems
  • Year:
  • 2013

Abstract

Mental rotation concerns the cognitive processes that allow an agent to mentally rotate the image of an object in order to solve a given task, for example to judge whether two objects with different orientations are the same or different. Here we present a system-level bio-constrained model, developed within a neurorobotics framework, that provides an embodied account of mental rotation processes relying on neural mechanisms involving motor affordance encoding, motor simulation and the anticipation of the sensory consequences of actions (both visual and proprioceptive). This model and methodology are in agreement with the most recent theoretical and empirical research on mental rotation. The model was validated through experiments with a simulated humanoid robot (iCub) engaged in solving a classical mental rotation test. The results show that the robot is able to solve the task and, in agreement with data from psychology experiments, exhibits response times that depend linearly on the angular disparity between the objects. This model represents a novel, detailed operational account of the embodied brain mechanisms that may underlie mental rotation.
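
To make the behavioural benchmark concrete, the sketch below simulates a classical same/different mental rotation trial in 2D, with response time modelled as a linear function of angular disparity, the signature the abstract reports for the iCub model. This is a minimal illustration only, not the authors' neurorobotic architecture; the shape, rotation rate, and base response time are assumed values chosen for the example.

```python
# Minimal illustrative sketch (not the authors' iCub model): a same/different
# mental rotation trial in 2D, where simulated response time grows linearly
# with angular disparity, as in classical mental rotation experiments.
# BASE_RT_S and ROTATION_RATE_DEG_S are assumptions for illustration.
import numpy as np

BASE_RT_S = 1.0            # assumed non-rotation processing time (s)
ROTATION_RATE_DEG_S = 60.0  # assumed mental rotation speed (deg/s)

def rotate(points, angle_deg):
    """Rotate 2D points (N x 2) about the origin by angle_deg."""
    a = np.radians(angle_deg)
    rot = np.array([[np.cos(a), -np.sin(a)],
                    [np.sin(a),  np.cos(a)]])
    return points @ rot.T

def mental_rotation_trial(reference, probe_angle_deg, mirrored=False):
    """Return ('same' | 'different', simulated response time in seconds)."""
    probe = rotate(reference, probe_angle_deg)
    if mirrored:
        probe = probe * np.array([-1.0, 1.0])  # mirror about the y-axis
    # Simulated decision: rotate the probe back and compare to the reference.
    aligned = rotate(probe, -probe_angle_deg)
    same = np.allclose(aligned, reference)
    # Response time increases linearly with the angular disparity covered.
    rt = BASE_RT_S + abs(probe_angle_deg) / ROTATION_RATE_DEG_S
    return ("same" if same else "different"), rt

if __name__ == "__main__":
    # An asymmetric L-shaped point set, so a mirrored probe is detectable.
    shape = np.array([[0, 0], [1, 0], [2, 0], [2, 1]], dtype=float)
    for angle in (0, 60, 120, 180):
        answer, rt = mental_rotation_trial(shape, angle, mirrored=False)
        print(f"disparity {angle:3d} deg -> {answer}, RT ~ {rt:.2f} s")
```

Run as a script, this prints response times that rise by a fixed increment per degree of disparity, the linear trend the robot experiments are reported to reproduce.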