Neural associative networks with plastic synapses have been proposed as computational models of brain function and for technical applications such as pattern recognition and information retrieval. To guide biological models and to optimize technical applications, several definitions of memory capacity have been used to measure the efficiency of associative memory. Here we explain why the currently used performance measures bias the comparison between models and cannot serve as a theoretical benchmark. We introduce fair measures for information-theoretic capacity in associative memory that also provide a theoretical benchmark. In neural networks, two types of synaptic manipulation can be discerned: synaptic plasticity, the change in strength of existing synapses, and structural plasticity, the creation and pruning of synapses. One of the new capacity measures we introduce makes it possible to quantify how structural plasticity can increase network efficiency by compressing the network structure, for example, by pruning unused synapses. Specifically, we analyze operating regimes of the Willshaw model in which structural plasticity can compress the network structure and push performance to the theoretical benchmark. In these regimes, the amount C of information stored per synapse can scale with the logarithm of the network size rather than being bounded by a constant, as in the classical Willshaw and Hopfield nets (C ≤ ln 2 ≈ 0.7). Further, the review contains novel technical material: a capacity analysis of the Willshaw model that rigorously controls for the level of retrieval quality, an analysis for memories with a nonconstant number of active units (where C ≤ 1/(e ln 2) ≈ 0.53), and an analysis of the computational complexity of associative memories with and without network compression.
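To make the setting concrete, the listing below is a minimal Python sketch of the binary Willshaw associative memory, not the authors' reference implementation. The network size n, pattern activity k, and number of stored pairs M are illustrative assumptions, and the helpers random_sparse_pattern and retrieve are hypothetical names. The sketch stores sparse pattern pairs with the clipped Hebbian rule, retrieves with a simple threshold, and reports the memory load p1, the quantity that determines how far the weight matrix can be compressed:

import numpy as np

rng = np.random.default_rng(0)

# Illustrative sizes (assumptions, not values from the paper):
n = 1000   # units per layer
k = 10     # active units per pattern (sparse coding)
M = 500    # number of stored pattern pairs

def random_sparse_pattern(n, k, rng):
    """Binary vector of length n with exactly k ones."""
    x = np.zeros(n, dtype=np.uint8)
    x[rng.choice(n, size=k, replace=False)] = 1
    return x

# Address and content patterns (heteroassociation; use U == V for autoassociation).
U = np.array([random_sparse_pattern(n, k, rng) for _ in range(M)])
V = np.array([random_sparse_pattern(n, k, rng) for _ in range(M)])

# Clipped Hebbian (Willshaw) learning: synapse W[i, j] is switched on
# if units i and j were coactive in at least one stored pair.
W = (U.T @ V > 0).astype(np.uint8)

def retrieve(W, u):
    """Threshold retrieval: fire every unit whose dendritic sum
    reaches the number of active address units."""
    return (u @ W >= u.sum()).astype(np.uint8)

v_hat = retrieve(W, U[0])
print("wrong bits for pattern 0:", int(np.sum(v_hat != V[0])))

# Memory load: fraction of potentiated synapses. A compressed (or pruned)
# representation of W needs roughly H(p1) bits per synapse, where H is
# the binary entropy -- the lever that structural plasticity exploits.
p1 = W.mean()
H = -(p1 * np.log2(p1) + (1 - p1) * np.log2(1 - p1))
print(f"memory load p1 = {p1:.3f}, ~{H:.3f} bits/synapse after compression")

With sparse patterns the memory load stays low, so most synapses remain silent; pruning them (or entropy-coding the matrix) rather than storing all n*n bits is, in the regimes the abstract describes, what lets the information per remaining synapse grow beyond the classical ln 2 bound.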