Spiking neuron model for temporal sequence recognition
Neural Computation
A spiking neural network that learns temporal sequences is described. A sparse code in which individual neurons represent sequences and subsequences enables multiple sequences to be stored without interference. The network is founded on a model of sequence compression in the hippocampus that is robust to variation in sequence element duration and well suited to learning sequences through spike-timing-dependent plasticity (STDP). Three additions to the sequence compression model underlie the sparse representation: synapses between the network's neurons that are subject to STDP, a competitive plasticity rule so that neurons specialize to individual sequences, and persistent depolarization after spiking so that neurons retain a short-term memory of having fired. The response to a new sequence element is determined by the neurons that responded to the preceding subsequence, according to the competitively learned synaptic connections. Numerical simulations show that the model can learn sets of intersecting sequences, presented with widely differing frequencies, with elements of varying duration.
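The three mechanisms named in the abstract — STDP on the network's synapses, a competitive plasticity rule, and post-spike depolarization acting as a short-term memory — can be illustrated with a toy sketch. The code below is not the authors' model: the network sizes, learning-rate constants, winner-take-all spiking, and weight-normalization form of the competitive rule are all illustrative assumptions, chosen only to show how the three ingredients interact.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy sizes (assumed, not from the paper): input symbols and
# sequence-coding neurons.
N_IN, N_OUT = 8, 4
W = rng.uniform(0.0, 0.1, (N_OUT, N_IN))  # plastic synapses onto the network
A_PLUS, A_MINUS = 0.05, 0.03              # illustrative STDP gains
DECAY = 0.5                               # depolarization decay per step

def present_sequence(seq, W, depol):
    """Drive the network with one sequence of input-symbol indices.

    Each step, the neuron with the highest drive (synaptic input plus
    lingering depolarization from its own recent spike) wins and spikes.
    Its synapse from the just-active input is potentiated and its other
    synapses slightly depressed (a pair-based STDP caricature); the row
    is then normalized, a simple stand-in for the competitive rule that
    makes neurons specialize to individual sequences.
    """
    for sym in seq:
        drive = W[:, sym] + depol
        winner = int(np.argmax(drive))
        W[winner, sym] += A_PLUS
        W[winner, :] -= A_MINUS / N_IN
        np.clip(W[winner], 0.0, None, out=W[winner])
        W[winner] /= W[winner].sum() + 1e-9  # competitive normalization
        # Post-spike depolarization: the winner carries a decaying trace
        # of having fired, so the next element is read in context.
        depol *= DECAY
        depol[winner] += 1.0
    return W, depol

# Repeated presentation of one sequence: a single neuron comes to
# represent it, with synaptic mass concentrated on its elements.
for _ in range(20):
    W, _ = present_sequence([0, 1, 2], W, np.zeros(N_OUT))
```

Because the depolarization trace dominates the drive on the steps after a neuron first spikes, the same neuron tends to capture an entire sequence, echoing the abstract's sparse code in which individual neurons represent sequences.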