We investigate how different forms of plasticity shape the dynamics and computational properties of simple recurrent spiking neural networks. In particular, we study the effect of combining two forms of neuronal plasticity: spike-timing-dependent plasticity (STDP), which changes synaptic strengths, and intrinsic plasticity (IP), which adjusts the excitability of individual neurons to keep their activity levels homeostatic. We find that the interaction of these two forms of plasticity gives rise to rich network dynamics characterized by a comparatively large number of stable limit cycles. Studying the response of such networks to external input, we find that they exhibit a fading memory of recent inputs. We then demonstrate that the combination of STDP and IP shapes network structure and dynamics in ways that allow the discovery of patterns in input time series and lead to good performance in time series prediction. Our results underscore the importance of studying how different forms of plasticity interact in shaping network behavior.
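To make the two plasticity rules concrete, here is a minimal sketch of how they can be combined in a recurrent network. This is not the authors' implementation: it uses binary threshold units as a stand-in for spiking neurons, a simple pair-based STDP rule (potentiate when a presynaptic spike precedes a postsynaptic spike by one step, depress the reverse ordering), and an IP rule that nudges each neuron's firing threshold so its average activity approaches a target rate. All names and parameter values (`N`, `ETA_STDP`, `ETA_IP`, `H_TARGET`) are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

N = 20           # number of neurons (illustrative)
ETA_STDP = 0.01  # STDP learning rate (illustrative)
ETA_IP = 0.001   # IP learning rate (illustrative)
H_TARGET = 0.1   # homeostatic target firing rate (illustrative)

# Sparse random recurrent weights and per-neuron firing thresholds.
W = rng.uniform(0.0, 1.0, (N, N)) * (rng.random((N, N)) < 0.2)
np.fill_diagonal(W, 0.0)
theta = rng.uniform(0.0, 1.0, N)

x = (rng.random(N) < H_TARGET).astype(float)  # initial binary state

for t in range(1000):
    x_prev = x
    # Binary threshold dynamics: a unit "spikes" when its recurrent
    # drive exceeds its (plastic) threshold.
    x = (W @ x_prev - theta > 0).astype(float)

    # Pair-based STDP on existing synapses only: strengthen W[i, j]
    # when j fired at t-1 and i fired at t, weaken it for the
    # reverse temporal order.
    mask = W > 0
    dW = ETA_STDP * (np.outer(x, x_prev) - np.outer(x_prev, x))
    W = np.clip(W + dW * mask, 0.0, 1.0)

    # Intrinsic plasticity: raise the threshold of units firing above
    # the target rate, lower it for units firing below it.
    theta += ETA_IP * (x - H_TARGET)
```

Note the division of labor the sketch makes visible: STDP only reorganizes the relative strengths of existing synapses, while IP acts as a per-neuron control loop that prevents activity from dying out or saturating, which is what allows the coupled system to settle into structured periodic activity rather than trivial fixed points.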