Simultaneously emerging Braitenberg codes and compositionality

  • Authors:
  • Yuuya Sugita; Jun Tani; Martin V. Butz

  • Affiliations:
  • Department of Psychology (Cognitive Psychology), University of Würzburg, Germany; RIKEN Brain Science Institute, Japan; Department of Psychology (Cognitive Psychology), University of Würzburg, Germany

  • Venue:
  • Adaptive Behavior - Animals, Animats, Software Agents, Robots, Adaptive Systems
  • Year:
  • 2011


Abstract

Although many researchers have suggested that compositional concepts should be sensorimotor grounded, how to accomplish this remains unclear. This article introduces a second-order neural network with parametric biases (sNNPB) that learns compositional structures from sensorimotor time series data. The data were produced by a simulated robot that executed distinct object interactions (move-to and orient-toward). We show that various sNNPB setups can learn to compositionally imitate object interactions beyond those specifically trained, which was not possible with previous neural network (NN) architectures, including recurrent neural networks (RNNs). We also show that these imitation capabilities are accomplished by developing a self-organized, geometrically arranged compositional concept structure in the PB values and task-oriented, Braitenberg-like sensory encodings in hidden sensory layers. Because second-order connections were necessary to accomplish the task, we hypothesize that such connections may be essential to drive the learning of both sensorimotor-grounded compositional structures and Braitenberg-like, behavior-oriented "pro-presentations." From a cognitive perspective, we show how sensorimotor time series of interactions may be processed to generate the signals necessary to develop semantically compositional structures.
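To make the abstract's key architectural idea concrete: a "second-order" connection is one in which a weight is itself modulated by another unit's activity, so the parametric bias (PB) values can multiplicatively select how sensory input maps to hidden activity. The sketch below is a hypothetical minimal illustration in numpy, not the authors' sNNPB implementation; the layer sizes, the `tanh` nonlinearity, and the two example PB vectors are assumptions chosen only to show the mechanism.

```python
import numpy as np

# Hedged sketch of a second-order layer: parametric bias (PB) values
# multiplicatively gate the input-to-hidden mapping, so the effective
# weight matrix is a PB-dependent mixture. This is NOT the paper's code.

rng = np.random.default_rng(0)

n_in, n_pb, n_hid = 4, 2, 3  # illustrative sizes, not from the paper

# Third-order weight tensor W[k, i, j]: one input-to-hidden matrix per PB unit.
W = rng.normal(scale=0.5, size=(n_pb, n_in, n_hid))

def second_order_layer(x, pb):
    """h_j = tanh(sum_k sum_i pb_k * x_i * W[k, i, j])."""
    effective_W = np.tensordot(pb, W, axes=1)  # (n_in, n_hid): PB-weighted mix
    return np.tanh(x @ effective_W)

x = rng.normal(size=n_in)          # one sensory input vector
pb_move = np.array([1.0, 0.0])     # hypothetical PB setting, e.g. "move-to"
pb_orient = np.array([0.0, 1.0])   # hypothetical PB setting, e.g. "orient-toward"

# Same sensory input, different PB values -> different hidden encodings,
# i.e. the PB selects a behavior-oriented transformation of the input.
h_move = second_order_layer(x, pb_move)
h_orient = second_order_layer(x, pb_orient)
```

Because the PB enters multiplicatively rather than additively, changing it rescales and remixes the whole sensory mapping, which is the property the abstract ties to learning behavior-oriented, Braitenberg-like encodings.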