Large memory capacity in chaotic artificial neural networks: a view of the anti-integrable limit

  • Authors:
  • Wei Lin; Guanrong Chen

  • Affiliations:
  • Key Lab. of Mathematics for Nonlinear Sci. (Chinese Ministry of Education) and Sch. of Mathematical Sci. and the Centre for Comp. Sys. Biology, Fudan Univ., Shanghai, China and ...; Department of Electronic Engineering, City University of Hong Kong, Kowloon, Hong Kong

  • Venue:
  • IEEE Transactions on Neural Networks
  • Year:
  • 2009

Abstract

In the literature, it has been reported that a chaotic artificial neural network model with sinusoidal activation functions possesses a large memory capacity and a remarkable ability to retrieve stored patterns, outperforming conventional chaotic models that use only monotonic activation functions such as sigmoidal functions. This paper, from the viewpoint of the anti-integrable limit, elucidates the mechanism behind the superiority of models with periodic activation functions, a class that includes the sinusoidal ones. In particular, by virtue of the anti-integrable limit technique, this paper shows that any finite-dimensional neural network model with periodic activation functions and properly selected parameters has much richer chaotic dynamics, and that these dynamics ultimately determine the model's memory capacity and pattern-retrieval ability. To some extent, this paper demonstrates, both mathematically and numerically, that an appropriate choice of activation functions and control scheme can lead to a large memory capacity and better pattern-retrieval ability in artificial neural network models.
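The abstract does not give the model's equations, so the following minimal Python sketch is only an illustration of the comparison being described: it assumes a discrete-time Hopfield-style recurrent network x_{t+1} = f(W x_t) with Hebbian outer-product weights storing random binary patterns, and contrasts a periodic activation (sin) with a monotonic one (tanh). The sin(pi*u) scaling, the network size, the noise level, and the overlap measure are all assumptions chosen for illustration, not the paper's actual formulation or parameter regime.

```python
# Hypothetical sketch; the paper's exact model is not recoverable from the abstract.
# Compares a periodic (sinusoidal) activation with a monotonic (tanh) one in a
# discrete-time recurrent network with Hebbian pattern storage.
import numpy as np

rng = np.random.default_rng(0)
N, P = 100, 20                           # neurons, stored patterns (assumed sizes)
patterns = rng.choice([-1.0, 1.0], size=(P, N))
W = patterns.T @ patterns / N            # Hebbian (outer-product) weight matrix
np.fill_diagonal(W, 0.0)                 # no self-coupling

def run(activation, steps=200):
    """Iterate the network from a noisy version of pattern 0."""
    x = patterns[0] + 0.3 * rng.standard_normal(N)
    for _ in range(steps):
        x = activation(W @ x)
    # Overlap (normalized inner product) of the final state with each stored pattern.
    return np.abs(patterns @ np.sign(x)) / N

sinusoidal = lambda u: np.sin(np.pi * u)  # periodic activation (assumed scaling)
sigmoidal = np.tanh                       # monotonic baseline

print("max overlap, sin :", run(sinusoidal).max())
print("max overlap, tanh:", run(sigmoidal).max())
```

The sketch merely juxtaposes the two activation classes the abstract discusses; the paper's anti-integrable-limit analysis applies to its specific finite-dimensional model and cannot be reproduced from the abstract alone.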