A Quantitative Comparison of the Performance of Three Discrete Distributed Associative Memory Models
IEEE Transactions on Computers
A statistical method is applied to explore the characteristics of a class of neural-network autoassociative memories with N neurons and first-order synaptic interconnections. The memory matrix is constructed to store M = αN vectors using the outer-product learning algorithm. We prove theoretically that, with all diagonal terms of the memory matrix set to M and the input error ratio set to zero, the probability of successful recall Pr decreases steadily as α increases; however, once α exceeds 1.0, Pr begins to increase slowly. For 0 < Pr < 0.99, the tradeoff between the number of stable states and their attraction force is analyzed, and the memory capacity is shown to be at best 0.15N.
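The construction described above can be sketched numerically. The snippet below is a minimal illustration, not the paper's analysis: it builds the outer-product memory matrix for M bipolar (±1) patterns, which naturally leaves every diagonal term equal to M, and recalls a probe by repeated threshold updates. The function names (`build_memory`, `recall`), the synchronous update rule, and the step limit are assumptions for the sketch; the paper's statistical treatment does not prescribe a particular update schedule.

```python
import numpy as np

def build_memory(patterns):
    """Outer-product (Hebbian) memory matrix from bipolar pattern rows.

    For +/-1 patterns, each diagonal term T[i, i] equals M, the number
    of stored vectors, matching the construction discussed above.
    """
    patterns = np.asarray(patterns)
    return patterns.T @ patterns  # sum of outer products p_k p_k^T

def recall(T, probe, steps=20):
    """Synchronous threshold recall: iterate s <- sign(T s) to a fixed point.

    Ties (zero activation) are resolved to +1; the step limit is an
    illustrative safeguard against cycles, not part of the model.
    """
    s = np.asarray(probe).copy()
    for _ in range(steps):
        nxt = np.where(T @ s >= 0, 1, -1)
        if np.array_equal(nxt, s):
            break
        s = nxt
    return s

if __name__ == "__main__":
    # Two mutually orthogonal patterns (N = 4, M = 2): crosstalk vanishes,
    # so each stored vector is an exact fixed point of the recall dynamics.
    stored = np.array([[1, 1, 1, 1],
                       [1, -1, 1, -1]])
    T = build_memory(stored)
    print(np.diag(T))                 # every diagonal term equals M = 2
    noisy = stored[0].copy()
    noisy[0] = -noisy[0]              # one-bit input error
    print(recall(T, noisy))           # converges back to the first pattern
```

At small loading ratios (M well below 0.15N) such recall from slightly corrupted probes typically succeeds, which is the regime the capacity result above quantifies.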