Global Stability of a General Class of Discrete-Time Recurrent Neural Networks

  • Authors:
  • Zhigang Zeng;De-Shuang Huang;Zengfu Wang

  • Affiliations:
  • Intelligent Computing Lab, Hefei Institute of Intelligent Machines, Chinese Academy of Sciences, Hefei, China 230031;Department of Automation, University of Science and Technology of China, Hefei, China 230026

  • Venue:
  • Neural Processing Letters
  • Year:
  • 2005


Abstract

A general class of discrete-time recurrent neural networks (DTRNNs) is formulated and studied in this paper. Several sufficient conditions ensuring the global stability of DTRNNs with delays are obtained via the induction principle rather than the well-known Liapunov methods. The results assume neither symmetry of the connection matrix nor boundedness, monotonicity, or differentiability of the activation functions. In addition, discrete-time analogues of a general class of continuous-time recurrent neural networks (CTRNNs) are derived and studied; these analogues preserve the convergence characteristics of the CTRNNs without any restriction on the uniform discretization step size. Finally, simulation results demonstrate the validity and feasibility of the proposed approach.
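To make the setting concrete, the following is a minimal sketch of a delayed DTRNN of the common form x(k+1) = A x(k) + B f(x(k)) + C f(x(k−τ)) + u, which is a typical instance of the model class studied here. The specific matrices, the tanh activation, and the function names are illustrative assumptions, not the paper's exact formulation; with contractive weights, the trajectory settles to a unique equilibrium regardless of the delayed history, as the stability results suggest.

```python
import numpy as np

def simulate_dtrnn(A, B, C, u, f, x_hist, steps, tau):
    """Iterate a delayed discrete-time RNN (illustrative form, not the
    paper's exact model):
        x(k+1) = A x(k) + B f(x(k)) + C f(x(k - tau)) + u
    x_hist holds the initial segment x(-tau), ..., x(0), shape (tau+1, n)."""
    hist = [np.asarray(h, dtype=float) for h in x_hist]
    for _ in range(steps):
        x_now = hist[-1]          # current state x(k)
        x_del = hist[-1 - tau]    # delayed state x(k - tau)
        hist.append(A @ x_now + B @ f(x_now) + C @ f(x_del) + u)
    return np.array(hist)

# Hypothetical 2-neuron example with contractive weights: the self-feedback
# matrix A and the (delayed) connection matrices B, C are small enough that
# the map is a contraction, so the trajectory converges to one equilibrium.
A = np.diag([0.5, 0.5])
B = np.array([[0.1, -0.05], [0.05, 0.1]])
C = np.array([[0.05, 0.0], [0.0, 0.05]])
u = np.array([1.0, -1.0])
tau = 2
x0 = np.zeros((tau + 1, 2))       # zero initial history
traj = simulate_dtrnn(A, B, C, u, np.tanh, x0, steps=200, tau=tau)
```

Note that the connection matrices above are deliberately asymmetric and the tanh activation is only used for convenience; the paper's conditions require neither symmetry nor a particular activation.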