Lower bounds on the capacities of binary and ternary networks storing sparse random vectors

  • Authors:
  • Y. Baram; D. Sal'ee

  • Affiliations:
  • Technion-Israel Inst. of Technol., Haifa

  • Venue:
  • IEEE Transactions on Information Theory
  • Year:
  • 1992

Abstract

It is shown that the memory capacity of networks of binary neurons storing, by the Hebbian rule, sparse random vectors over {0, 1}^N is at least cN/(p log N), where c is a positive scalar involving the input error probabilities and p is the probability of an element being nonzero. A similar bound is derived for networks of ternary neurons storing sparse vectors over {-1, 0, 1}^N. These results, pertaining to stability and error correction with probability tending to one as the number of neurons tends to infinity, generalize and extend previously known capacity bounds for binary networks storing vectors of equally probable {±1} bits. Lower bounds on the capacities of binary and ternary networks of finite sizes are also derived. These bounds suggest critical network sizes that guarantee high gains in capacity per neuron for given sparsities.
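
The setting described in the abstract can be illustrated with a minimal sketch: sparse random patterns over {0, 1}^N are stored by a generic outer-product (Hebbian) rule, and stability is checked by feeding a stored pattern back through a thresholded update. This is not the paper's exact storage rule or threshold; the values of N, p, M, and the threshold theta below are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

N = 200   # number of neurons
p = 0.05  # probability of an element being nonzero (sparsity)
M = 15    # number of stored patterns (illustrative, well below any bound)

# Sparse random patterns over {0, 1}^N: each bit is 1 with probability p.
patterns = (rng.random((M, N)) < p).astype(float)

# Generic Hebbian (outer-product) weight matrix with zeroed self-connections;
# the paper's precise rule may differ.
W = patterns.T @ patterns
np.fill_diagonal(W, 0.0)

def recall(x, theta, steps=10):
    """Synchronous threshold dynamics: a unit fires if its field exceeds theta."""
    for _ in range(steps):
        x = (W @ x > theta).astype(float)
    return x

# Probe stability of a stored pattern: with sparse patterns, active units
# receive a signal of order pN while crosstalk noise stays much smaller.
probe = patterns[0]
theta = 0.5 * p * N  # hypothetical threshold between signal and noise levels
print("pattern stable:", np.array_equal(recall(probe.copy(), theta), probe))
```

The threshold choice matters for sparse coding: because only a fraction p of units are active, the firing threshold must sit between the crosstalk noise level and the signal of order pN received by active units, which is one intuition behind capacity bounds that scale with N/(p log N).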