Independent component analysis (or blind source separation) is assumed to be an essential component of sensory processing in the brain, as it could provide a less redundant representation of the external world. Another powerful processing strategy is to optimize internal representations according to the information bottleneck method. This method preferentially extracts those components of a high-dimensional sensory input stream that are related to other information sources, such as internal predictions or proprioceptive feedback. However, models explaining how spiking neurons could learn to execute either of these two processing strategies have been lacking. We show in this article how stochastically spiking neurons with refractoriness could in principle learn, in an unsupervised manner, to carry out both information bottleneck optimization and the extraction of independent components. We derive suitable learning rules, which extend the well-known BCM rule, from abstract information-optimization principles. These rules simultaneously keep the firing rate of the neuron within a biologically realistic range.
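For orientation, the classical BCM rule that the abstract's learning rules extend can be sketched as follows. This is a minimal, hypothetical rate-based illustration only: the weight change is Hebbian but gated by a sliding modification threshold that tracks the squared postsynaptic activity, which is what stabilizes the firing rate. The function name, parameters, and the linear-neuron assumption are illustrative choices, not the paper's actual spiking-neuron rules (which add information-theoretic terms and refractoriness).

```python
import numpy as np

def bcm_step(w, x, theta, eta=0.01, tau_theta=0.1):
    """One BCM-style update (illustrative sketch, not the paper's rule).

    w     : synaptic weight vector
    x     : presynaptic input vector
    theta : sliding modification threshold, tracking <y^2>
    """
    y = float(np.dot(w, x))                      # postsynaptic rate (linear neuron)
    w = w + eta * x * y * (y - theta)            # Hebbian term gated by threshold
    theta = theta + tau_theta * (y ** 2 - theta) # slide threshold toward <y^2>
    return w, y, theta

# Repeated presentation of random inputs: as y grows, theta follows y^2,
# turning the update depressive and keeping the output rate bounded.
rng = np.random.default_rng(0)
w = rng.normal(0.0, 0.1, size=5)
theta = 0.0
for _ in range(200):
    x = rng.normal(0.0, 1.0, size=5)
    w, y, theta = bcm_step(w, x, theta)
```

The sliding threshold is the feature that distinguishes BCM from plain Hebbian learning and is the aspect the abstract's rules build on to keep firing rates in a realistic range.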