Absolute exponential stability of recurrent neural networks with Lipschitz-continuous activation functions and time delays

  • Authors:
  • Jinde Cao; Jun Wang

  • Affiliations:
  • Department of Mathematics, Southeast University, Nanjing 210096, Jiangsu, China; Department of Automation and Computer-Aided Engineering, The Chinese University of Hong Kong, Shatin, New Territories, Hong Kong, China

  • Venue:
  • Neural Networks
  • Year:
  • 2004

Abstract

This paper investigates the absolute exponential stability of a general class of delayed neural networks whose activation functions are required only to be partially Lipschitz continuous and monotone nondecreasing, not necessarily differentiable or bounded. Using a delayed Halanay-type inequality and Lyapunov functions, three new sufficient conditions are derived for ascertaining whether the equilibrium points of delayed neural networks with additively diagonally stable interconnection matrices are absolutely exponentially stable. The stability criteria also apply to delayed optimization neural networks and delayed cellular neural networks, whose activation functions are often nondifferentiable or unbounded. The results answer the question: if a neural network without delay is absolutely exponentially stable, then under what additional conditions is the corresponding delayed neural network also absolutely exponentially stable?
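To make the setting concrete, the class of networks studied here is typically written as x'(t) = -Dx(t) + Af(x(t)) + Bf(x(t - τ)) + u, with D a positive diagonal matrix, A and B the instantaneous and delayed interconnection matrices, and f a componentwise activation. The sketch below is an illustrative Euler simulation of such a system (the specific matrices and the use of a nondifferentiable, unbounded activation are my assumptions for demonstration, not values from the paper); with sufficiently dominant self-feedback in D, trajectories from different initial states converge exponentially to a common equilibrium.

```python
import numpy as np

# Illustrative Euler simulation of a delayed additive neural network
#   x'(t) = -D x(t) + A f(x(t)) + B f(x(t - tau)) + u
# The activation below is monotone nondecreasing and Lipschitz, but
# nondifferentiable at 0 and unbounded (as permitted by the paper's class).

def f(x):
    return np.maximum(x, 0.0)

def simulate(D, A, B, u, tau, x0, dt=1e-3, T=20.0):
    steps = int(T / dt)
    delay = int(tau / dt)
    # Constant initial condition on [-tau, 0] (an assumed choice)
    hist = np.tile(x0, (delay + 1, 1))
    x = x0.astype(float).copy()
    traj = [x.copy()]
    for _ in range(steps):
        x_del = hist[0]                         # x(t - tau)
        dx = -D @ x + A @ f(x) + B @ f(x_del) + u
        x = x + dt * dx
        hist = np.vstack([hist[1:], x])         # slide the delay window
        traj.append(x.copy())
    return np.array(traj)

# Example matrices (hypothetical): self-feedback d_i exceeds the row sums
# |a_ij| + |b_ij|, a classical sufficient condition for exponential stability.
D = np.diag([3.0, 3.0])
A = np.array([[0.5, -0.2], [0.1, 0.4]])
B = np.array([[0.2, 0.1], [-0.1, 0.2]])
u = np.array([1.0, -0.5])

traj1 = simulate(D, A, B, u, tau=0.5, x0=np.array([2.0, -2.0]))
traj2 = simulate(D, A, B, u, tau=0.5, x0=np.array([-1.0, 3.0]))
# Both runs settle at the same equilibrium despite different initial states
print(np.allclose(traj1[-1], traj2[-1], atol=1e-4))
```

The Euler scheme here is only a numerical illustration; the paper's contribution is the analytic conditions under which such convergence holds for the entire activation class, independently of the particular equilibrium.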