The capacity of the Hopfield associative memory

  • Authors:
  • R. J. McEliece; E. C. Posner; E. R. Rodemich; S. S. Venkatesh


  • Venue:
  • IEEE Transactions on Information Theory
  • Year:
  • 1987


Abstract

Techniques from coding theory are applied to study rigorously the capacity of the Hopfield associative memory. Such a memory stores n-tuples of ±1's. The components change depending on a hard-limited version of linear functions of all other components. With symmetric connections between components, a stable state is ultimately reached. By building up the connection matrix as a sum of outer products of m fundamental memories, one hopes to be able to recover a certain one of the m memories by using an initial n-tuple probe vector less than a Hamming distance n/2 away from the fundamental memory. If m fundamental memories are chosen at random, the maximum asymptotic value of m in order that most of the m original memories are exactly recoverable is n/(2 log n). With the added restriction that every one of the m fundamental memories be recoverable exactly, m can be no more than n/(4 log n) asymptotically as n approaches infinity. Extensions are also considered, in particular to capacity under quantization of the outer-product connection matrix. This quantized memory capacity problem is closely related to the capacity of the quantized Gaussian channel.
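The sum-of-outer-products construction and hard-limited update rule described in the abstract can be sketched as follows. This is an illustrative sketch, not the paper's code; the parameter values (n = 200, m = 5, a 10% probe corruption) are assumptions chosen to sit well below the n/(2 log n) capacity bound:

```python
import numpy as np

rng = np.random.default_rng(0)

n, m = 200, 5  # n components, m fundamental memories; m is well below n/(2 log n)
memories = rng.choice([-1, 1], size=(m, n))  # random ±1 fundamental memories

# Sum-of-outer-products connection matrix; zero diagonal (no self-connections)
W = memories.T @ memories
np.fill_diagonal(W, 0)

def recall(probe, max_iters=50):
    """Repeatedly hard-limit the linear sums of the other components
    until a stable state is reached."""
    state = probe.copy()
    for _ in range(max_iters):
        new_state = np.where(W @ state >= 0, 1, -1)
        if np.array_equal(new_state, state):
            break  # stable state reached
        state = new_state
    return state

# Probe: a fundamental memory corrupted in far fewer than n/2 positions
probe = memories[0].copy()
flip = rng.choice(n, size=n // 10, replace=False)  # flip 10% of the components
probe[flip] *= -1

recovered = recall(probe)
print(np.array_equal(recovered, memories[0]))  # True with high probability at this low load
```

At this load the signal term in each hard-limited sum dominates the cross-talk from the other stored memories, which is why the corrupted probe flows back to the stored pattern; as m grows toward n/(2 log n), that cross-talk is exactly what begins to destroy recovery.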