Structure and Dynamics of Random Recurrent Neural Networks

  • Authors:
  • Hugues Berry, Mathias Quoy

  • Affiliations:
  • INRIA Futurs, France; ETIS UMR8051, France

  • Venue:
  • Adaptive Behavior - Animals, Animats, Software Agents, Robots, Adaptive Systems
  • Year:
  • 2006


Abstract

In contrast to Hopfield-like networks, random recurrent neural networks (RRNNs), in which the couplings are random, exhibit complex dynamics (limit cycles, chaos). Information can be stored in these networks through Hebbian learning. Eventually, learning "destroys" the dynamics and leads to a fixed-point attractor. We investigate here the structural changes that learning induces in the network. We show that a simple Hebbian learning rule redistributes the synaptic weights from an initial homogeneous, random distribution to a heterogeneous one, in which strong synaptic weights preferentially assemble into triangles. Learning thus organizes the network of large synaptic weights as a "small-world" network.
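The setting described in the abstract can be illustrated with a minimal sketch: a discrete-time rate network with random Gaussian couplings, updated by a simple Hebbian rule. The network size, gain, learning rate, and the exact form of the update are illustrative assumptions, not the model specified in the paper.

```python
import numpy as np

rng = np.random.default_rng(0)
N = 100   # number of neurons (illustrative size)
g = 5.0   # coupling gain; large gain pushes random networks toward complex dynamics

# Random coupling matrix, as in a random recurrent neural network (RRNN)
W = rng.normal(0.0, g / np.sqrt(N), size=(N, N))

def step(x, W):
    """One discrete-time update of the firing rates (tanh transfer function)."""
    return np.tanh(W @ x)

x = rng.standard_normal(N)
eta = 0.01  # Hebbian learning rate (illustrative value)

for t in range(500):
    x_new = step(x, W)
    # Simple Hebbian rule: strengthen couplings between co-active units
    W += eta * np.outer(x_new, x)
    x = x_new

# After learning, the weight distribution is heterogeneous; one can
# threshold |W| and inspect the resulting graph of strong weights for
# triangles / clustering, as the paper does.
strong = np.abs(W) > np.percentile(np.abs(W), 95)
print("strong links:", strong.sum())
```

Thresholding at a high percentile keeps only the strongest couplings; the paper's claim is that these strong links cluster into triangles, giving the subnetwork a "small-world" character.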