Improved storage capacity of hebbian learning attractor neural network with bump formations

  • Authors:
  • Kostadin Koroutchev; Elka Korutcheva

  • Affiliations:
  • EPS, Universidad Autónoma de Madrid, Cantoblanco, Madrid, Spain; Depto. de Física Fundamental, Universidad Nacional de Educación a Distancia, Madrid, Spain

  • Venue:
  • ICANN'06 Proceedings of the 16th international conference on Artificial Neural Networks - Volume Part I
  • Year:
  • 2006


Abstract

Bump formations in attractor neural networks with distance-dependent connectivity have recently attracted increasing interest in biological and computational neuroscience. Although distance-dependent connectivity is common in biological networks, a common drawback of such networks is the sharp drop in the number of patterns p that can be remembered when the activity changes from global to bump-like, which makes them highly inefficient. In this paper we present a bump-based recurrent network specifically designed to increase its capacity, which becomes comparable to that of a randomly connected sparse network. To this aim, we tested a selection of 700 natural images on a network of N = 64K neurons with connectivity C per neuron. We show that the capacity of the network is of order C, in accordance with the capacity of a highly diluted network. When the number of connections per neuron is kept fixed, a non-trivial dependence of the capacity on the connectivity radius is observed. Our results show that the decrease in capacity of the bumpy network can be avoided.
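To make the setting concrete, the following is a minimal sketch (not the paper's actual model) of a Hebbian attractor network with distance-dependent connectivity: neurons sit on a ring, each connects only to neighbors within a radius R, sparse binary patterns are stored with a covariance (Hebbian) rule, and retrieval from a noisy cue is measured by the overlap with the stored pattern. All parameter values (N, R, sparseness a, threshold theta) are toy illustrations, far smaller than the N = 64K network used in the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy parameters (illustrative only; the paper uses N = 64K)
N = 400        # neurons on a ring
R = 40         # connectivity radius: i connects to j iff 0 < |i - j| <= R (mod N)
a = 0.1        # mean activity of the sparse patterns
P = 3          # number of stored patterns

# Distance-dependent connectivity mask on a ring (C = 2R connections per neuron)
idx = np.arange(N)
dist = np.abs(idx[:, None] - idx[None, :])
dist = np.minimum(dist, N - dist)
mask = (dist <= R) & (dist > 0)

# Sparse binary patterns and the covariance (Hebbian) learning rule,
# restricted to the local connectivity and normalized per neuron
xi = (rng.random((P, N)) < a).astype(float)
C = mask.sum(axis=1, keepdims=True)
J = ((xi - a).T @ (xi - a)) * mask / (a * (1 - a) * C)

def recall(cue, steps=30, theta=0.3):
    """Synchronous threshold dynamics: s_i <- step(sum_j J_ij s_j - theta)."""
    s = cue.copy()
    for _ in range(steps):
        s = (J @ s > theta).astype(float)
    return s

# Cue: the first stored pattern with 10% of the bits flipped
cue = xi[0].copy()
flip = rng.random(N) < 0.1
cue[flip] = 1 - cue[flip]

s = recall(cue)

# Overlap (normalized covariance) between the final state and the stored pattern
overlap = ((s - a) @ (xi[0] - a)) / (N * a * (1 - a))
print(f"overlap with stored pattern: {overlap:.2f}")
```

With only local connections, the stored information each neuron sees is limited to its C neighbors, which is why the load that matters in such networks scales with C rather than N; the threshold theta here plays the role of the activity control that, in bump regimes, localizes the retrieval activity.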