Dynamics and storage capacity of neural networks with small-world topology

  • Authors:
  • Silvia Scarpetta, Antonio De Candia, Ferdinando Giacco

  • Affiliations:
  • Dipartimento di Fisica “E. R. Caianiello”, Università di Salerno, Italy and INFN, Sezione di Napoli e Gruppo Coll. di Salerno
  • Dipartimento di Scienze Fisiche, Università di Napoli Federico II and INFN, Sezione di Napoli e Gruppo Coll. di Salerno and CNR-SPIN, Unità di Napoli
  • Dipartimento di Fisica “E. R. Caianiello”, Università di Salerno, Italy and INFN, Sezione di Napoli e Gruppo Coll. di Salerno

  • Venue:
  • Proceedings of the 2011 conference on Neural Nets WIRN10: Proceedings of the 20th Italian Workshop on Neural Nets
  • Year:
  • 2011


Abstract

We study the storage of phase-coded patterns as stable dynamical attractors in recurrent spin neural networks with small-world topology. The synaptic strength of existing connections is determined by a learning rule based on spike-time-dependent plasticity (STDP), with an asymmetric time window depending on the relative timing between pre- and post-synaptic activity. We store multiple patterns and study the network capacity in sparse networks with different topologies. We study networks where each neuron is connected only to a small number z ≪ N of other neurons. Connections can be short range, between neighboring neurons placed on a regular lattice, or long range, between randomly chosen pairs of neurons. We find that a small fraction of long-range connections is able to amplify the capacity of the network. This implies that a small-world network topology may be optimal, as a compromise between the cost of long-range connections and the capacity increase.
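The construction sketched in the abstract can be illustrated with a short script: a sparse connectivity of degree z ≪ N built from a ring lattice with a fraction of edges rewired to random long-range targets (in the spirit of Watts–Strogatz), and a coupling matrix for phase-coded patterns with an asymmetric, STDP-inspired dependence on phase differences. All parameter values, the cos/sin form of the learning rule, and the normalization are illustrative assumptions, not the paper's exact prescription.

```python
import numpy as np

rng = np.random.default_rng(0)

N = 200        # number of neurons (illustrative)
z = 10         # connections per neuron, z << N (illustrative)
p_long = 0.1   # fraction of long-range (rewired) connections (illustrative)

# Build a sparse small-world adjacency: start from a ring lattice where
# each neuron connects to its z nearest neighbours, then rewire a
# fraction p_long of the edges to randomly chosen targets.
adj = np.zeros((N, N), dtype=bool)
for i in range(N):
    for k in range(1, z // 2 + 1):
        for j in ((i + k) % N, (i - k) % N):
            if rng.random() < p_long:
                j = int(rng.integers(N))   # long-range: random target
            if j != i:
                adj[i, j] = True

# Store P phase-coded patterns: each pattern assigns a phase in [0, 2*pi)
# to every neuron.  The coupling depends on the phase difference between
# post- and pre-synaptic neurons; the cos term is the symmetric (Hebbian)
# part, the sin term the asymmetric part arising from the STDP time
# window (sign conventions are assumptions for this sketch).
P = 3
phases = rng.uniform(0.0, 2.0 * np.pi, size=(P, N))

J = np.zeros((N, N))
for mu in range(P):
    dphi = phases[mu][:, None] - phases[mu][None, :]   # phi_i - phi_j
    J += np.cos(dphi) + np.sin(dphi)
J *= adj / z   # keep only existing connections, normalise by degree

print("mean degree:", adj.sum() / N)
print("coupling matrix shape:", J.shape)
```

With these parameters roughly one in ten edges per neuron becomes long range; the capacity study in the paper then amounts to varying p_long (and z) and counting how many phase patterns remain stable attractors of the resulting dynamics.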