We present a new learning algorithm that uses oscillations in the strength of neural inhibition to train neural networks. Raising inhibition exposes weak parts of target memories, which are then strengthened; lowering inhibition exposes competitors, which are then weakened. To update weights, we apply the Contrastive Hebbian Learning equation to successive time steps of the network, with the sign of the weight change varying as a function of the phase of the inhibitory oscillation. We show that the algorithm can memorize large numbers of correlated input patterns without collapsing, and that it generalizes well to test patterns that do not exactly match studied patterns.
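The update rule can be sketched concretely. Below is a minimal Python illustration (not the authors' published implementation) of the idea: a k-winners-take-all layer whose threshold is modulated by a sinusoidal inhibitory signal, with a Contrastive-Hebbian-style update, delta_w_ij = +/- lr * (x_i(t) x_j(t) - x_i(t-1) x_j(t-1)), applied between successive time steps. The sinusoidal schedule, the k-winners-take-all inhibition mechanism, and the sign convention (treating the state nearer baseline inhibition as the CHL "plus" phase) are illustrative assumptions rather than details taken from the paper.

import numpy as np

rng = np.random.default_rng(0)
n_units, k_base, lr = 50, 10, 0.01

# Symmetric random weights, no self-connections.
W = rng.normal(0.0, 0.1, size=(n_units, n_units))
W = (W + W.T) / 2.0
np.fill_diagonal(W, 0.0)

def settle(pattern, W, inhibition):
    # One settling step under k-winners-take-all inhibition. Raising
    # `inhibition` shrinks k, so the weakest target units drop out;
    # lowering it grows k, so extra units (competitors) come on.
    net = pattern + W @ pattern
    k = int(np.clip(round(k_base * (1.0 - 0.5 * inhibition)), 1, n_units))
    active = np.zeros(n_units)
    active[np.argsort(net)[-k:]] = 1.0
    return active

def train_pattern(W, pattern, n_steps=20):
    # Present one pattern while inhibition sweeps through a full
    # oscillation; apply a CHL-style update between successive steps.
    prev_phase = 0.0
    x_prev = settle(pattern, W, inhibition=prev_phase)
    for t in range(1, n_steps + 1):
        phase = np.sin(2.0 * np.pi * t / n_steps)  # oscillating inhibition
        x = settle(pattern, W, inhibition=phase)
        # The sign tracks the oscillation: the state nearer baseline
        # inhibition serves as the CHL "plus" phase, so weak target
        # units exposed at high inhibition are strengthened, and
        # competitors exposed at low inhibition are weakened.
        sign = 1.0 if abs(phase) < abs(prev_phase) else -1.0
        W += sign * lr * (np.outer(x, x) - np.outer(x_prev, x_prev))
        np.fill_diagonal(W, 0.0)
        x_prev, prev_phase = x, phase
    return W

# Example: study a small list of random binary patterns.
patterns = (rng.random((5, n_units)) < k_base / n_units).astype(float)
for _ in range(10):                      # a few study epochs
    for p in patterns:
        W = train_pattern(W, p)

In use, one would call train_pattern once per study item and repeat over the study list for several epochs; testing would present partial or noisy cues at baseline inhibition and read out the settled pattern.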