References

On the asymptotic information storage capacity of neural networks. Neural Computers.
Introduction to the theory of neural computation.
Elements of information theory.
On the Optimality of the Simple Bayesian Classifier under Zero-One Loss. Machine Learning - Special issue on learning with probabilistic representations.
Sparse Distributed Memory.
Neural Assemblies, an Alternative Approach to Artificial Intelligence.
Semi-Naive Bayesian Classifier. EWSL '91 Proceedings of the European Working Session on Machine Learning.
Attractor Neural Networks with Hypercolumns. ICANN '02 Proceedings of the International Conference on Artificial Neural Networks.
Information Retrieval and Categorisation using a Cell Assembly Network. Neural Computing and Applications.
Handbook of Mathematical Functions, With Formulas, Graphs, and Mathematical Tables.
Effective Neuronal Learning with Ineffective Hebbian Learning Rules. Neural Computation.
An RCE-based Associative Memory with Application to Human Face Recognition. Neural Processing Letters.
Inhomogeneities in heteroassociative memories with linear learning rules. Neural Computation.
Optimal plasticity from matrix memories: What goes up must come down. Neural Computation.
Multi-index hashing for information retrieval. SFCS '94 Proceedings of the 35th Annual Symposium on Foundations of Computer Science; IEEE Transactions on Computers.
Memory capacities for synaptic and structural plasticity. Neural Computation.
Neural associative memory for brain modeling and information retrieval. Information Processing Letters - Special issue on applications of spiking neural networks.
Cell assemblies for diagnostic problem-solving. Neurocomputing.
Optimal matrix compression yields storage capacity 1 for binary Willshaw associative memory. ICANN/ICONIP'03 Proceedings of the 2003 joint international conference on Artificial neural networks and neural information processing.
Bayesian retrieval in associative memories with storage errors. IEEE Transactions on Neural Networks; Neural Computation.
Neural associative memories and sparse coding. Neural Networks.
Integer sparse distributed memory: Analysis and results. Neural Networks.
Neural associative memories are perceptron-like single-layer networks with fast synaptic learning that typically store discrete associations between pairs of neural activity patterns. Previous work optimized the memory capacity for various models of synaptic learning: for example, linear Hopfield-type rules, the Willshaw model employing binary synapses, and the BCPNN rule of Lansner and Ekeberg. Here I show that all of these previous models are limiting cases of a general optimal model in which synaptic learning is determined by probabilistic Bayesian considerations. Asymptotically, for large networks and very sparse neuron activity, the Bayesian model becomes identical to an inhibitory implementation of the Willshaw and BCPNN-type models. For less sparse patterns, the Bayesian model becomes identical to Hopfield-type networks employing the covariance rule. For intermediate sparseness or finite networks, the optimal Bayesian learning rule differs from the previous models and can significantly improve memory performance. I also provide a unified analytical framework for determining memory capacity at a given output noise level that links approaches based on mutual information, Hamming distance, and signal-to-noise ratio.
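The three families of learning rules mentioned in the abstract can be illustrated on a toy heteroassociative task. The following is a minimal sketch, not the paper's exact formulation: the Bayesian weights use an assumed simple Laplace-smoothed log-odds estimator, and all function names (`willshaw`, `covariance`, `bayesian`, `retrieve`) are illustrative.

```python
import numpy as np

def willshaw(U, V):
    # Binary clipped Hebbian learning: a synapse is 1 if its pre/post
    # pair was coactive in at least one stored association.
    return (U.T @ V > 0).astype(int)

def covariance(U, V, p, q):
    # Hopfield-type covariance rule: W_ij = sum over patterns of
    # (u_i - p)(v_j - q), with p, q the design activity levels.
    return (U - p).T @ (V - q)

def bayesian(U, V):
    # Assumed log-odds weights with Laplace smoothing: compare how often
    # input i is active when output j is on versus off across the M
    # stored associations.
    M = U.shape[0]
    c11 = U.T @ V              # input on, output on
    c10 = U.T @ (1 - V)        # input on, output off
    n1 = V.sum(axis=0)         # unit usage of each output neuron
    return (np.log((c11 + 1) / (n1 + 2))
            - np.log((c10 + 1) / (M - n1 + 2)))

def retrieve(W, u, k):
    # Dendritic potentials followed by k-winners-take-all.
    x = u @ W
    out = np.zeros(W.shape[1], dtype=int)
    out[np.argsort(-x)[:k]] = 1
    return out

# Two hand-built sparse associations u^mu -> v^mu.
U = np.array([[1, 1, 0, 0],
              [0, 0, 1, 1]])
V = np.array([[1, 0, 1, 0],
              [0, 1, 0, 1]])

for W in (willshaw(U, V), covariance(U, V, 0.5, 0.5), bayesian(U, V)):
    print(retrieve(W, U[0], 2))   # each rule recalls v^1 = [1 0 1 0]
```

On this small, low-load example all three rules recall the stored content pattern perfectly; the differences the abstract describes only emerge at higher memory load and different sparseness regimes.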