Klopf, A. H. (1986). A drive-reinforcement model of single neuron function: An alternative to the Hebbian neuronal model. AIP Conference Proceedings 151 on Neural Networks for Computing.
Kosko, B. (1986). Differential Hebbian learning. AIP Conference Proceedings 151 on Neural Networks for Computing.
Hinton, G. E. (1989). Connectionist learning procedures. Artificial Intelligence.
Barlow, H. B., & Földiák, P. (1989). Adaptation and decorrelation in the cortex. In The Computing Neuron.
Földiák, P. (1991). Learning invariance from transformation sequences. Neural Computation.
Slow feature discriminant analysis and its application on handwritten digit recognition. IJCNN'09: Proceedings of the 2009 International Joint Conference on Neural Networks.
Nonlinear dimensionality reduction using a temporal coherence principle. Information Sciences: An International Journal.
I describe a local synaptic learning rule that can be used to remove the effects of certain types of systematic temporal variation in the inputs to a unit. According to this rule, changes in synaptic weight result from a conjunction of short-term temporal changes in the inputs and the output. Formally, $\Delta w_i \propto -\,\delta x_i\,\delta y$, where $\delta x_i$ and $\delta y$ are the short-term changes in the $i$th input and in the output, and $\Delta w_i$ is the resulting change in the $i$th weight. This is like the differential rule proposed by Klopf (1986) and Kosko (1986), except for a change of sign, which gives it an anti-Hebbian character. By itself this rule is insufficient: a weight-conservation condition is needed to prevent the weights from collapsing to zero, and some further constraint, implemented here by a biasing term, is needed to select particular sets of weights from the subspace of those that give minimal variation. As an example, I show that this rule generates center-surround receptive fields that remove temporally varying linear gradients from the inputs.
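The rule is simple enough to simulate directly. The Python sketch below is a minimal illustration under stated assumptions, not the paper's implementation: the input statistics (a linear gradient whose offset and slope drift in time, plus noise), the learning rate `eta`, the bias strength `beta`, and the Gaussian profile used for the biasing term are all choices introduced here for illustration. Only the anti-Hebbian differential update, the fixed-norm weight conservation, and the presence of a biasing term follow the abstract.

```python
import numpy as np

rng = np.random.default_rng(0)

n = 11                                # inputs along a 1-D receptive field
pos = np.linspace(-1.0, 1.0, n)       # input positions
eta = 0.05                            # anti-Hebbian learning rate (assumed)
beta = 0.001                          # strength of the biasing term (assumed)
bias = np.exp(-(pos / 0.3) ** 2)      # center-peaked profile the bias pulls toward (assumed)

w = rng.normal(0.0, 0.1, n)
w /= np.linalg.norm(w)                # weight conservation: keep a fixed norm

def gradient(t):
    """A linear gradient across the inputs whose offset and slope drift in
    time: the systematic temporal variation the rule should learn to remove."""
    return np.sin(0.13 * t) * np.ones(n) + np.cos(0.07 * t) * pos

x_prev = gradient(0) + rng.normal(0.0, 0.05, n)
for t in range(1, 50000):
    x = gradient(t) + rng.normal(0.0, 0.05, n)
    dx = x - x_prev                   # short-term change in the inputs
    dy = w @ dx                       # short-term change in the output y = w . x
    w += -eta * dx * dy               # anti-Hebbian differential update: dw_i = -eta * dx_i * dy
    w += beta * (bias - w)            # biasing term selects one minimal-variation solution
    w /= np.linalg.norm(w)            # re-impose weight conservation
    x_prev = x

# The learned weights should be nearly orthogonal to constant and linear
# inputs (zero mean and zero first moment): a center-surround profile.
print("w . ones =", w @ np.ones(n))
print("w . pos  =", w @ pos)
print("weights: ", np.round(w, 3))
```

Because the short-term input changes are dominated by the constant and ramp components of the gradient, the update should drive the weight vector toward the subspace orthogonal to both, where the output no longer responds to the gradient; the biasing term then picks out a center-peaked solution within that subspace, so the weights come out positive in the center and negative in the surround.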