Global exponential convergence of recurrent neural networks with variable delays

  • Authors:
  • Zhang Yi

  • Affiliations:
  • School of Computer Science and Engineering, University of Electronic Science and Technology of China, Chengdu 610054, People's Republic of China

  • Venue:
  • Theoretical Computer Science
  • Year:
  • 2004

Abstract

Convergence analysis of recurrent neural networks is an important research direction in the field of neural networks. Novel methods are proposed for studying the global exponential convergence of recurrent neural networks with variable delays. A delay-independent condition for global exponential stability is derived by the analysis of delayed inequalities. A second condition, which depends on the delays, is obtained by constructing a suitable Lyapunov functional.
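The abstract contrasts a delay-independent with a delay-dependent stability condition. The sketch below illustrates, numerically, what delay-independent global exponential convergence means. It relies on assumed standard modeling choices that the abstract does not spell out: a Hopfield-type network x'(t) = -Dx(t) + Af(x(t)) + Bf(x(t - tau)) + u with f = tanh, a classical diagonal-dominance sufficient condition, and forward-Euler integration. It is an illustration of the general phenomenon, not the paper's exact system or proof method.

```python
import numpy as np

# Illustrative delayed recurrent network (assumed model, not taken
# verbatim from the paper):
#   x'(t) = -D x(t) + A f(x(t)) + B f(x(t - tau)) + u,   f = tanh.

def simulate_delayed_rnn(A, B, d, u, tau, x0, dt=0.01, T=20.0):
    """Forward-Euler integration; the history before t = 0 is held at x0."""
    steps = int(T / dt)
    lag = int(tau / dt)
    traj = np.zeros((steps + 1, len(x0)))
    traj[0] = x0
    for k in range(steps):
        x = traj[k]
        x_lag = traj[max(k - lag, 0)]  # delayed state f(x(t - tau))
        traj[k + 1] = x + dt * (-d * x + A @ np.tanh(x)
                                + B @ np.tanh(x_lag) + u)
    return traj

# A classical delay-independent sufficient condition (tanh is 1-Lipschitz):
# each decay rate d_i dominates the corresponding column sum of |A| + |B|.
A = np.array([[0.2, -0.1], [0.1, 0.2]])
B = np.array([[0.1, 0.1], [-0.1, 0.1]])
d = np.array([2.0, 2.0])
u = np.array([0.5, -0.3])
assert np.all(d > np.abs(A).sum(axis=0) + np.abs(B).sum(axis=0))

# Trajectories started from different initial states converge to the same
# equilibrium, and their distance decays roughly exponentially for any tau.
ta = simulate_delayed_rnn(A, B, d, u, tau=1.0, x0=np.array([5.0, -5.0]))
tb = simulate_delayed_rnn(A, B, d, u, tau=1.0, x0=np.array([-3.0, 4.0]))
gap = np.linalg.norm(ta - tb, axis=1)
print(gap[0], gap[len(gap) // 2], gap[-1])  # gap shrinks toward 0
```

Changing `tau` (or making it time-varying, as in the paper's setting) does not destroy convergence here, which is the point of a delay-independent condition; the delay-dependent condition in the abstract trades this robustness for a less conservative criterion.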