We develop a learning rule for networks of spiking neurons in which signals are encoded using fractionally predictive spike-coding. In this paradigm, neural output signals are encoded as a sum of shifted power-law kernels. Simple greedy thresholding can compute this encoding, and the resulting spike trains are then exactly the signal's fractional derivative. Fractionally predictive spike-coding exploits natural signal statistics and is consistent with the spike-rate adaptation observed in real neurons; its multiple-timescale properties also reconcile the notions of spike-time coding and spike-rate coding. Previously, we argued that properly tuning the decoding kernel at receiving neurons can implement spectral filtering; the applicability to general temporal filtering was left open. Here, we present an error-backpropagation algorithm to learn these decoding filters, and we show that networks of fractionally predictive spiking neurons can then implement temporal filters such as delayed responses, delayed match-to-sample, and temporal versions of the XOR problem.
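As a rough illustration of the greedy thresholding encoder described above, the following Python sketch approximates a signal as a running sum of shifted power-law kernels, emitting a spike whenever the residual exceeds a threshold. The kernel shape (exponent beta, offset t0), its normalization, and the fixed threshold are illustrative assumptions for this sketch, not the exact formulation or parameters used in the paper.

```python
import numpy as np

def power_law_kernel(length, beta=0.8, t0=1.0):
    """Causal power-law decay kernel kappa(t) ~ (t + t0)^(-beta), normalized to peak 1.
    beta and t0 are assumed illustrative values, not the paper's settings."""
    t = np.arange(length, dtype=float)
    kernel = (t + t0) ** (-beta)
    return kernel / kernel[0]

def greedy_spike_encode(signal, threshold=0.2, beta=0.8, t0=1.0):
    """Greedy thresholding encoder (sketch): whenever the residual between the
    signal and the running reconstruction exceeds the threshold, emit a spike
    and add a shifted power-law kernel to the reconstruction."""
    n = len(signal)
    kernel = power_law_kernel(n, beta, t0)
    approx = np.zeros(n)               # running sum of shifted kernels
    spikes = np.zeros(n, dtype=int)    # binary spike train
    for t in range(n):
        if signal[t] - approx[t] >= threshold:
            spikes[t] = 1
            approx[t:] += threshold * kernel[: n - t]
    return spikes, approx

if __name__ == "__main__":
    t = np.linspace(0, 10, 1000)
    x = 1.0 + np.sin(2 * np.pi * 0.3 * t)   # example slowly varying signal
    spikes, x_hat = greedy_spike_encode(x)
    print("spikes emitted:", spikes.sum())
    print("mean reconstruction error:", np.mean(np.abs(x - x_hat)))
```

In this sketch, the long power-law tail means each spike predicts the signal far into the future, so slowly varying signals can be tracked with relatively few spikes; the decoding filters that the error-backpropagation algorithm learns would operate on spike trains produced in this manner.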