Learning sensory representations with intrinsic plasticity
Neurocomputing
Related publications:
Synergies Between Intrinsic and Synaptic Plasticity Mechanisms. Neural Computation.
Delay learning and polychronization for reservoir computing. Neurocomputing.
Improving reservoirs using intrinsic plasticity. Neurocomputing.
A Globally Asymptotically Stable Plasticity Rule for Firing Rate Homeostasis. ICANN '08 Proceedings of the 18th International Conference on Artificial Neural Networks, Part II.
Learning inverse kinematics for pose-constraint bi-manual movements. SAB'10 Proceedings of the 11th International Conference on Simulation of Adaptive Behavior: From Animals to Animats.
Improving recurrent neural network performance using transfer entropy. ICONIP'10 Proceedings of the 17th International Conference on Neural Information Processing: Models and Applications, Part II.
Intrinsic adaptation in autonomous recurrent neural networks. Neural Computation.
A self-organized neural comparator. Neural Computation.
While synaptic learning mechanisms have long been a core topic of neural computation research, there has been relatively little work on intrinsic learning processes, which change a neuron's excitability. Here, we study a single model neuron with a continuous activation function and derive an information-theoretic gradient rule for intrinsic plasticity that allows the neuron to bring its firing-rate distribution into an approximately exponential regime, as observed in visual cortical neurons. In simulations, we show that the rule works efficiently.
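The abstract does not spell out the update equations, but the kind of rule it describes is commonly cited in the intrinsic-plasticity literature: online stochastic-gradient updates of the gain a and bias b of a sigmoid neuron y = sigmoid(a*x + b), minimizing the KL divergence between the output distribution and an exponential distribution with target mean mu. The sketch below follows that widely used form; the learning rate eta, target mean mu, and the Gaussian input drive are illustrative assumptions, not details taken from the paper.

```python
import numpy as np

def sigmoid(u):
    return 1.0 / (1.0 + np.exp(-u))

def ip_step(x, a, b, mu=0.2, eta=0.01):
    """One intrinsic-plasticity update (commonly cited gradient form).

    Adapts gain a and bias b of y = sigmoid(a*x + b) so that the output
    distribution approaches an exponential distribution with mean mu.
    mu and eta are illustrative choices, not values from the paper.
    """
    y = sigmoid(a * x + b)
    # Gradient of the KL divergence to an exponential target, w.r.t. b:
    db = eta * (1.0 - (2.0 + 1.0 / mu) * y + (y * y) / mu)
    # The gain update reuses db: da = eta/a + x*db.
    da = eta / a + x * db
    return a + da, b + db, y

# Demo: drive the neuron with Gaussian input (a stand-in for real input)
# and let the intrinsic parameters adapt online.
rng = np.random.default_rng(0)
a, b, mu = 1.0, 0.0, 0.2
ys = []
for _ in range(30000):
    x = rng.normal()
    a, b, y = ip_step(x, a, b, mu=mu)
    ys.append(y)

mean_late = float(np.mean(ys[-5000:]))  # mean firing rate after adaptation
```

After adaptation, the mean output should be close to the target mu, and the output histogram becomes strongly skewed toward low rates, the hallmark of an approximately exponential firing-rate distribution.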