A mathematical analysis of the effects of Hebbian learning rules on the dynamics and structure of discrete-time random recurrent neural networks

  • Authors:
  • Benoît Siri; Hugues Berry; Bruno Cessac; Bruno Delord; Mathias Quoy

  • Affiliations:
  • Team Alchemy, INRIA, Parc Club Orsay Université, 91893 Orsay Cedex, France. benoit.siri@gmail.com; Team Alchemy, INRIA, Parc Club Orsay Université, 91893 Orsay Cedex, France. hugues.berry@inria.fr; Team Odyssee, INRIA, 06902 Sophia Antipolis, France / Université de Nice, Parc Valrose, 06000 Nice, France / Institut Non Linéaire de Nice, UMR 6618 CNRS, 06560 Valbonne, France. bruno ...; ANIM, U742 INSERM, Université P.M. Curie, 75005 Paris, France. bruno.delord@snv.jussieu.fr; ETIS, UMR 8051 CNRS-Université de Cergy-Pontoise-ENSEA, 95014 Cergy-Pontoise Cedex, France. quoy@ensea.fr

  • Venue:
  • Neural Computation
  • Year:
  • 2008

Abstract

We present a mathematical analysis of the effects of Hebbian learning in random recurrent neural networks, using a generic Hebbian learning rule that includes passive forgetting and different timescales for neuronal activity and learning dynamics. Previous numerical work has reported that Hebbian learning drives the system from chaos to a steady state through a sequence of bifurcations. Here, we interpret these results mathematically and show that these effects, involving a complex coupling between neuronal dynamics and synaptic graph structure, can be analyzed using Jacobian matrices, which introduce both a structural and a dynamical point of view on neural network evolution. Furthermore, we show that sensitivity to a learned pattern is maximal when the largest Lyapunov exponent is close to 0. We discuss how neural networks may take advantage of this regime of high functional interest.
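As a rough illustration of the setting described in the abstract, and not the authors' exact equations, the sketch below simulates a discrete-time random recurrent network, applies a Hebbian update with passive forgetting on a slower timescale than the neuronal dynamics, and estimates the largest Lyapunov exponent from products of the map's Jacobians. The activation function, the specific Hebbian rule W ← λW + (α/N)·x·xᵀ, and the parameter values are assumptions chosen only for illustration.

```python
# Minimal sketch (assumed model, not the paper's exact one): a discrete-time
# random recurrent network x(t+1) = tanh(g * W x(t)), Hebbian learning with
# passive forgetting W <- lam*W + (alpha/N)*outer(post, pre) applied every
# tau_w steps (slower learning timescale), and the largest Lyapunov exponent
# estimated from the Jacobians J(t) = g * diag(1 - x(t+1)^2) W.
import numpy as np

rng = np.random.default_rng(0)
N, g = 100, 3.0            # network size, gain (large g -> chaotic regime)
lam, alpha = 0.99, 0.01    # passive forgetting factor, Hebbian learning rate
tau_w = 10                 # learning is applied every tau_w network steps
T_total = 5000

W = rng.normal(0.0, 1.0 / np.sqrt(N), size=(N, N))   # random synaptic graph
x = rng.uniform(-1.0, 1.0, size=N)                    # initial neuronal state

v = rng.normal(size=N)
v /= np.linalg.norm(v)                                # tangent vector
log_growth = 0.0

for t in range(T_total):
    pre = x
    x = np.tanh(g * W @ pre)
    # Jacobian couples structure (W) and dynamics (diagonal of f' at the new state)
    J = g * (1.0 - x**2)[:, None] * W
    v = J @ v
    nv = np.linalg.norm(v)
    log_growth += np.log(nv)
    v /= nv
    if (t + 1) % tau_w == 0:
        # Hebbian update with passive forgetting (slow timescale)
        W = lam * W + (alpha / N) * np.outer(x, pre)

lyap_max = log_growth / T_total
print(f"estimated largest Lyapunov exponent: {lyap_max:.3f}")
```

Tracking the estimated exponent as learning proceeds gives a simple way to observe the reported drift from chaos (positive exponent) toward a steady state (negative exponent), with the regime near 0 being the one the abstract identifies as functionally interesting.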