On the asymptotic information storage capacity of neural networks
Neural Computers
Modeling brain function—the world of attractor neural networks
The role of constraints in Hebbian learning
Neural Computation
The role of weight normalization in competitive learning
Neural Computation
Memory maintenance via neuronal regulation
Neural Computation
Synaptic pruning in development: a computational account
Neural Computation
Associative memory with dynamic synapses
Neural Computation
Homeostatic synaptic scaling in self-organizing maps
Neural Networks, 2006 Special Issue: Advances in Self-Organizing Maps (WSOM'05)
Inhomogeneities in heteroassociative memories with linear learning rules
Neural Computation
Memory dynamics in attractor networks with saliency weights
Neural Computation
Neural associative memory with optimal Bayesian learning
Neural Computation
In this article we revisit the classical neuroscience paradigm of Hebbian learning. We find that effective associative memory storage is difficult to achieve with Hebbian synaptic learning alone, since it requires either that network-level information be available at each synapse or that the stored patterns be sparsely coded. Effective learning can nevertheless be achieved, even with nonsparse patterns, by a neuronal process that maintains a zero sum of each neuron's incoming synaptic efficacies. This weight correction improves the memory capacity of associative networks from an essentially bounded one to one that scales linearly with network size. It also enables the effective storage of patterns with multiple levels of activity within a single network. Such neuronal weight correction can be carried out by activity-dependent homeostasis of the neuron's synaptic efficacies, which has recently been observed in cortical tissue. Our findings therefore suggest that associative learning by Hebbian synaptic plasticity should be accompanied by continuously operating, neuronally driven regulatory processes in the brain.
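The zero-sum weight correction described in the abstract can be illustrated with a small simulation. The Python sketch below is not code from the article; it assumes a plain additive Hebbian rule on nonsparse binary patterns, an illustrative network size and pattern count, and a heuristic retrieval threshold. It simply compares retrieval with the raw Hebbian weights against retrieval after each neuron's incoming efficacies have been shifted to sum to zero.

```python
# Minimal sketch (assumed parameters, not the article's simulation):
# Hebbian storage of nonsparse {0,1} patterns, with and without a
# zero-sum correction of each neuron's incoming synaptic efficacies.
import numpy as np

rng = np.random.default_rng(0)

N = 400   # number of neurons (illustrative)
M = 20    # number of stored memory patterns (illustrative)
p = 0.5   # activity level; p = 0.5 means the patterns are nonsparse

# Random binary memory patterns, one per row.
patterns = (rng.random((M, N)) < p).astype(float)

# Plain additive Hebbian learning: each pattern increments W[i, j] by the
# product of post- and presynaptic activity. All increments are non-negative,
# so the resulting efficacies have a large positive mean.
W = np.zeros((N, N))
for xi in patterns:
    W += np.outer(xi, xi)
np.fill_diagonal(W, 0.0)

# Neuronal weight correction: each neuron subtracts the mean of its own
# incoming efficacies, so every row of W sums to zero. This uses only
# information available at the neuron, not network-wide statistics.
W_zero_sum = W - W.sum(axis=1, keepdims=True) / (N - 1)
np.fill_diagonal(W_zero_sum, 0.0)

def recall(weights, cue, theta, steps=5):
    """Synchronous threshold dynamics for a binary associative network."""
    state = cue.copy()
    for _ in range(steps):
        state = (weights @ state > theta).astype(float)
    return state

# Heuristic firing threshold: half the expected retrieval field of a stored
# pattern under the zero-sum weights (an assumption of this sketch).
theta = 0.5 * N * p * (1.0 - p)

# Cue the network with a degraded version of the first stored pattern
# (10% of its bits flipped).
cue = patterns[0].copy()
flip = rng.choice(N, size=N // 10, replace=False)
cue[flip] = 1.0 - cue[flip]

for name, weights in [("uncorrected Hebbian", W), ("zero-sum corrected", W_zero_sum)]:
    overlap = np.mean(recall(weights, cue, theta) == patterns[0])
    print(f"{name:>22}: fraction of bits recalled correctly = {overlap:.3f}")
```

In this sketch the correction is applied once, after learning; in the article it is motivated instead as an ongoing, activity-dependent homeostatic process acting on the neuron's synaptic efficacies.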