We present a mathematical analysis of the effects of Hebbian learning in random recurrent neural networks, using a generic Hebbian learning rule that includes passive forgetting and distinct timescales for neuronal activity and learning dynamics. Previous numerical work has reported that Hebbian learning drives such systems from chaos to a steady state through a sequence of bifurcations. Here, we interpret these results mathematically and show that these effects, which involve a complex coupling between neuronal dynamics and synaptic graph structure, can be analyzed using Jacobian matrices, which provide both a structural and a dynamical point of view on neural network evolution. Furthermore, we show that sensitivity to a learned pattern is maximal when the largest Lyapunov exponent is close to 0. We discuss how neural networks may take advantage of this regime of high functional interest.
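To make the setup concrete, here is a minimal sketch of the kind of system the abstract describes: a discrete-time random recurrent network whose weights evolve under a Hebbian rule with passive forgetting on a slower timescale, with the largest Lyapunov exponent estimated from products of Jacobian matrices. The specific model (a tanh rate network), the covariance-based form of the Hebbian term, and all parameter values (`N`, `g`, `lam`, `alpha`, `T`) are illustrative assumptions, not the paper's exact equations or values.

```python
import numpy as np

rng = np.random.default_rng(0)

N = 100        # number of neurons (illustrative)
g = 3.0        # initial coupling gain, large enough for chaotic dynamics
lam = 0.99     # passive-forgetting factor (lambda < 1 decays old weights)
alpha = 0.01   # Hebbian learning rate (slow timescale)
T = 100        # neuronal steps per learning update (timescale separation)
epochs = 200   # number of slow learning updates

# Random recurrent weight matrix, scaled ~ 1/sqrt(N).
W = rng.normal(0.0, g / np.sqrt(N), size=(N, N))
x = rng.uniform(-1.0, 1.0, size=N)

def largest_lyapunov(W, x, n_steps=1000):
    """Estimate the largest Lyapunov exponent by propagating a tangent
    vector with the Jacobian D(t) = diag(1 - x(t+1)^2) @ W and
    renormalizing at each step."""
    v = rng.normal(size=N)
    v /= np.linalg.norm(v)
    acc = 0.0
    for _ in range(n_steps):
        x = np.tanh(W @ x)                 # x(t+1) = tanh(W x(t))
        J = (1.0 - x**2)[:, None] * W      # Jacobian of the tanh map
        v = J @ v
        nv = np.linalg.norm(v)
        acc += np.log(nv)
        v /= nv
    return acc / n_steps

for epoch in range(epochs):
    # Fast neuronal dynamics between two learning updates.
    traj = np.empty((T, N))
    for t in range(T):
        x = np.tanh(W @ x)
        traj[t] = x
    # Slow Hebbian update with passive forgetting:
    # W <- lam * W + (alpha / N) * <m_i m_j>, m = centered activity.
    m = traj - traj.mean(axis=0)
    W = lam * W + (alpha / N) * (m.T @ m) / T
    if epoch % 20 == 0:
        print(f"epoch {epoch:4d}  lambda_max ~ {largest_lyapunov(W, x):+.3f}")
```

Under parameter choices like these, the printed exponent typically starts positive (chaos) and decreases as learning contracts the dynamics, which is the bifurcation route from chaos toward a steady state that the analysis addresses; the regime where the estimate approaches 0 corresponds to the point of maximal pattern sensitivity discussed above.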