Associative memory by recurrent neural networks with delay elements

  • Authors:
  • Seiji Miyoshi; Hiro-Fumi Yanai; Masato Okada

  • Affiliations:
  • Department of Electronic Engineering, Kobe City College of Technology, 8-3 Gakuen-Higashimachi, Nishi-ku, Kobe 651-2194, Japan; Department of Media and Telecommunications, Faculty of Engineering, Ibaraki University, Naka-Narusawa, Hitachi, Ibaraki 316-8511, Japan; Explor. Res. for Adv. Tech., Japan Sci. and Tech. Corp., Kyoto and Lab. for Math. Neuros., RIKEN Brain Sci. Inst., Wako, Saitama 351-0198 and Intell. Coop. and Ctrl., PRESTO, JS&TC, Saitama 351-01 ...

  • Venue:
  • Neural Networks
  • Year:
  • 2004


Abstract

The synapses of real neural systems seem to have delays, so it is worthwhile to analyze associative memory models with delayed synapses. We therefore discuss a sequential associative memory model with delayed synapses, employing a discrete synchronous updating rule and a correlation learning rule, and analyze its dynamic properties by statistical neurodynamics. In this paper, we first re-derive the Yanai-Kim theory, which gives the macrodynamical equations for a network with serial delay elements. Since their theory requires a computational complexity of O(L^4 t) to obtain the macroscopic state at time step t, where L is the length of delay, it is intractable for discussing the macroscopic properties in the large-L limit. We therefore derive steady-state equations using the discrete Fourier transformation, whose computational complexity does not formally depend on L. We show that the storage capacity α_C is proportional to the delay length L in the large-L limit, with proportionality constant 0.195, i.e. α_C = 0.195L. These results are supported by computer simulations.
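To make the model concrete, here is a minimal NumPy sketch of a sequential associative memory with L serial delay elements, correlation learning, and discrete synchronous updating. The sizes N, L, and p, the cyclic-sequence indexing, and the particular learning rule J^l_{ij} = (1/N) Σ_μ ξ_i^{μ+1} ξ_j^{μ−l} are illustrative assumptions consistent with the abstract, not the paper's exact formulation.

```python
import numpy as np

rng = np.random.default_rng(0)

N = 300   # neurons (illustrative; the paper's analysis is for N -> infinity)
L = 3     # number of serial delay elements per synapse
p = 60    # length of the stored cyclic sequence; loading rate alpha = p / N

# Random +-1 patterns xi[0..p-1], to be recalled cyclically in order.
xi = rng.choice([-1.0, 1.0], size=(p, N))

# Correlation learning (assumed indexing): the delay-l synapse associates
# the pattern l steps in the past with the next pattern in the sequence,
# so a state history riding along the sequence drives recall of xi[t+1].
J = np.zeros((L, N, N))
for l in range(L):
    for mu in range(p):
        J[l] += np.outer(xi[(mu + 1) % p], xi[(mu - l) % p])
J /= N

# Initialize the delay line on the sequence itself: history[l] = x(t - l).
history = [xi[(-l) % p].copy() for l in range(L)]

# Discrete synchronous updating: x(t+1) = sgn(sum over l of J^l x(t - l)).
for t in range(20):
    h = sum(J[l] @ history[l] for l in range(L))
    x_new = np.where(h >= 0, 1.0, -1.0)
    history = [x_new] + history[:-1]
    overlap = xi[(t + 1) % p] @ x_new / N   # order parameter m(t+1)
    print(f"t={t + 1:2d}  overlap with xi[{(t + 1) % p:2d}] = {overlap:+.3f}")
```

With these illustrative numbers the loading rate is α = p/N = 0.2, well below the predicted capacity α_C ≈ 0.195L = 0.585 for L = 3, so the printed overlaps should stay close to 1 as the network steps through the sequence.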