Identifying, formalizing, and combining biological mechanisms that implement known brain functions, such as prediction, is a central goal of research in theoretical neuroscience. In this letter, the mechanisms of spike-timing-dependent plasticity and homeostatic plasticity, combined in an original mathematical formalism, are shown to shape recurrent neural networks into predictors. Through a rigorous mathematical treatment, we prove that together they implement an online gradient descent on a distance between the network activity and its stimuli. Convergence to an equilibrium, at which the network can spontaneously reproduce or predict its stimuli, does not suffer from the bifurcation issues usually encountered when training recurrent neural networks.
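To give a concrete intuition for the abstract's central claim, the following is a minimal illustrative sketch of online gradient descent on a distance between a recurrent network's activity and its stimulus. It uses a linear rate network rather than the letter's spiking formalism, and all particulars (the two-unit state, the learning rate `eta`, the sinusoidal stimulus) are assumptions chosen for the example, not quantities from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 2                       # network size (assumed for the example)
W = rng.normal(scale=0.1, size=(n, n))  # recurrent weights
eta = 0.05                  # learning rate (assumed)
T = 2000                    # number of online updates

def stimulus(t):
    # A periodic stimulus the network should learn to predict;
    # its one-step transition is exactly linear (a rotation),
    # so a linear network can represent the predictor.
    return np.array([np.sin(0.1 * t), np.cos(0.1 * t)])

for t in range(T):
    s_t, s_next = stimulus(t), stimulus(t + 1)
    pred = W @ s_t                    # network's one-step prediction
    err = pred - s_next               # mismatch with the stimulus
    W -= eta * np.outer(err, s_t)     # online gradient step on ||err||^2

# After learning, the network reproduces the next stimulus frame
# from the current one with small error.
final_err = np.linalg.norm(W @ stimulus(T) - stimulus(T + 1))
print(final_err)
```

Because each update is a gradient step on the instantaneous squared distance, the weights converge toward the stimulus's transition map, after which the network can run spontaneously and reproduce the input — the rate-based analogue of the equilibrium described in the abstract.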