Global exponential periodicity and global exponential stability of a class of recurrent neural networks with various activation functions and time-varying delays

  • Authors:
  • Boshan Chen; Jun Wang

  • Affiliations:
  • Department of Mathematics, Hubei Normal University, Huangshi, Hubei, 435002, China; Department of Mechanical & Automation Engineering, The Chinese University of Hong Kong, Shatin, New Territories, Hong Kong

  • Venue:
  • Neural Networks
  • Year:
  • 2007

Abstract

The paper presents theoretical results on the global exponential periodicity and global exponential stability of a class of recurrent neural networks with various general activation functions and time-varying delays. The general activation functions include monotone nondecreasing functions, globally Lipschitz continuous and monotone nondecreasing functions, semi-Lipschitz continuous mixed monotone functions, and Lipschitz continuous functions. For each class of activation functions, testable algebraic criteria for ascertaining the global exponential periodicity and global exponential stability of the networks are derived using the comparison principle and the theory of monotone operators. Furthermore, the rate of exponential convergence and bounds on the attractive domains of periodic oscillations or equilibrium points are estimated. The convergence analysis based on these generalized activation functions widens the scope of neural network models to which the results apply. In addition, the analytical method enriches the toolbox for the qualitative analysis of neural networks.
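For context, a typical member of the model class described in the abstract can be sketched as a delayed recurrent network in standard notation (the exact formulation, symbols, and assumptions in the paper may differ):

```latex
\dot{x}_i(t) = -d_i x_i(t)
  + \sum_{j=1}^{n} a_{ij}\, g_j\bigl(x_j(t)\bigr)
  + \sum_{j=1}^{n} b_{ij}\, g_j\bigl(x_j(t - \tau_{ij}(t))\bigr)
  + u_i(t), \qquad i = 1, \dots, n,
```

where \(d_i > 0\) are decay rates, \(A = (a_{ij})\) and \(B = (b_{ij})\) are the connection and delayed-connection weight matrices, \(g_j\) are activation functions drawn from one of the four classes listed above, \(\tau_{ij}(t) \ge 0\) are bounded time-varying delays, and \(u_i(t)\) are periodic (or constant) external inputs. In this setting, global exponential periodicity means every trajectory converges at an exponential rate to a unique periodic orbit; with constant inputs, the analogous property is global exponential stability of a unique equilibrium point.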