Spiking neurons are highly flexible computational modules: by adjusting their synaptic parameters, they can implement an enormous variety of transformations F from input spike trains to output spike trains. In this letter, we examine to what extent a spiking neuron with biologically realistic models of dynamic synapses can be taught, via spike-timing-dependent plasticity (STDP), to implement a given transformation F. We consider a supervised learning paradigm in which the output of the neuron is clamped to the target signal during training (teacher forcing). The well-known perceptron convergence theorem guarantees the convergence of a simple supervised learning algorithm for drastically simplified neuron models (McCulloch-Pitts neurons). We show that, in contrast to the perceptron convergence theorem, no theoretical guarantee can be given for the convergence of STDP with teacher forcing that holds for arbitrary input spike patterns. On the other hand, we prove that average-case versions of the perceptron convergence theorem do hold for STDP in the case of uncorrelated and correlated Poisson input spike trains and simple models of spiking neurons. For a wide class of cross-correlation functions of the input spike trains, the resulting necessary and sufficient condition can be formulated in terms of linear separability, analogous to the well-known learnability condition for perceptrons; here, however, the linear separability criterion must be applied to the columns of the correlation matrix of the Poisson input. We demonstrate through extensive computer simulations that the theoretically predicted convergence of STDP with teacher forcing also holds for more realistic models of neurons and dynamic synapses, and for more general input distributions. In addition, we show through computer simulations that these positive learning results hold not only under the common interpretation of STDP, in which STDP changes the weights of synapses, but also under a more realistic interpretation suggested by experimental data, in which STDP modulates the initial release probability of dynamic synapses.
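To make the teacher-forcing setup concrete, the following is a minimal Python sketch, not the paper's implementation: a pair-based exponential STDP rule is applied to Poisson input spike trains while the postsynaptic output is clamped to a target spike train generated by a hidden linear-threshold "teacher". All names and parameter values (firing rate, `tau`, `A_plus`, `A_minus`, the threshold choice) are illustrative assumptions.

```python
# Minimal sketch of STDP with teacher forcing on Poisson inputs.
# Assumed: pair-based exponential STDP; a linear-threshold teacher
# defines the target transformation F. Parameters are illustrative.
import numpy as np

rng = np.random.default_rng(0)

n_in     = 20      # number of presynaptic Poisson inputs
T        = 200.0   # trial length (ms)
dt       = 1.0     # time step (ms)
rate     = 0.02    # input spike probability per step (~20 Hz)
tau      = 20.0    # STDP trace time constant (ms)
A_plus   = 0.005   # potentiation amplitude
A_minus  = 0.0055  # depression amplitude (slightly dominant)
n_trials = 500

steps    = int(T / dt)
w        = rng.uniform(0.0, 0.5, n_in)   # learned synaptic weights
w_target = rng.uniform(0.0, 1.0, n_in)   # hidden teacher weights defining F

for trial in range(n_trials):
    pre = rng.random((steps, n_in)) < rate          # Poisson input spikes
    # Teacher forcing: the output is clamped to the teacher's spikes,
    # here a simple linear-threshold readout of the same inputs.
    drive = pre.astype(float) @ w_target
    post  = drive > np.quantile(drive, 0.9)         # clamped output spikes

    x_pre, x_post = np.zeros(n_in), 0.0             # exponential traces
    for t in range(steps):
        x_pre  = x_pre  * np.exp(-dt / tau) + pre[t]
        x_post = x_post * np.exp(-dt / tau) + post[t]
        if post[t]:
            # pre-before-post pairings: potentiate recently active inputs
            w += A_plus * x_pre
        # post-before-pre pairings: depress inputs spiking just after output
        w -= A_minus * x_post * pre[t]
        np.clip(w, 0.0, 1.0, out=w)

# Crude check: learned weights should correlate with the teacher's weights
# when the convergence condition of the abstract is satisfied.
print(np.corrcoef(w, w_target)[0, 1])
```

For uncorrelated Poisson inputs the correlation matrix is close to diagonal, so the abstract's separability criterion on its columns reduces to ordinary linear separability; the final correlation printed above is only a rough proxy for the convergence the theorems describe.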