Asymptotic stability for neural networks with mixed time-delays: The discrete-time case

  • Authors:
  • Yurong Liu; Zidong Wang; Xiaohui Liu

  • Affiliations:
  • Department of Mathematics, Yangzhou University, Yangzhou 225002, PR China; Department of Information Systems and Computing, Brunel University, Uxbridge, Middlesex, UB8 3PH, United Kingdom; Department of Information Systems and Computing, Brunel University, Uxbridge, Middlesex, UB8 3PH, United Kingdom

  • Venue:
  • Neural Networks
  • Year:
  • 2009

Abstract

This paper is concerned with the stability analysis problem for a new class of discrete-time recurrent neural networks with mixed time-delays. The mixed time-delays, which comprise both discrete and distributed time-delays, are addressed for the first time in the asymptotic stability analysis of discrete-time neural networks. The activation functions are not required to be differentiable or strictly monotonic. The existence of the equilibrium point is first proved under mild conditions. By constructing a new Lyapunov-Krasovskii functional, a linear matrix inequality (LMI) approach is developed to establish sufficient conditions for the discrete-time neural networks to be globally asymptotically stable. As an extension, we further consider the stability analysis problem for the same class of neural networks but with state-dependent stochastic disturbances. All of the conditions obtained are expressed in terms of LMIs, whose feasibility can be easily checked by using the numerically efficient Matlab LMI Toolbox. A simulation example is presented to show the usefulness of the derived LMI-based stability condition.
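
To give a flavor of the kind of LMI feasibility test the abstract refers to, the sketch below checks the standard Lyapunov-Krasovskii LMI for a linear discrete-time system with a single constant delay, x(k+1) = A x(k) + B x(k-d). This is a deliberately simplified stand-in, not the paper's criterion: the matrices A and B are hypothetical, the nonlinear activations, distributed delays, and stochastic disturbances treated in the paper are omitted, and Python with cvxpy is used in place of the Matlab LMI Toolbox.

```python
import numpy as np
import cvxpy as cp

# Hypothetical system matrices for x(k+1) = A x(k) + B x(k - d),
# a linear stand-in for the delayed recurrent network studied in the paper.
A = np.array([[0.5, 0.1],
              [0.0, 0.4]])
B = np.array([[0.10, 0.00],
              [0.05, 0.10]])
n = A.shape[0]

# Decision variables of the Lyapunov-Krasovskii functional
# V(k) = x(k)' P x(k) + sum_{i=k-d}^{k-1} x(i)' Q x(i).
P = cp.Variable((n, n), symmetric=True)
Q = cp.Variable((n, n), symmetric=True)

# Difference of V along trajectories, written with xi = [x(k); x(k-d)]:
# Delta V = xi' M xi, so M < 0 certifies global asymptotic stability.
M = cp.bmat([[A.T @ P @ A - P + Q, A.T @ P @ B],
             [B.T @ P @ A,         B.T @ P @ B - Q]])

eps = 1e-6
constraints = [
    P >> eps * np.eye(n),
    Q >> eps * np.eye(n),
    # Symmetrize explicitly (a mathematical no-op here) before the SDP constraint.
    (M + M.T) / 2 << -eps * np.eye(2 * n),
]

prob = cp.Problem(cp.Minimize(0), constraints)  # pure feasibility problem
prob.solve(solver=cp.SCS)
print("LMI status:", prob.status)               # 'optimal' => feasible => stability certified
```

If the semidefinite program is feasible, the matrices P and Q returned by the solver certify that Delta V < 0 along all trajectories, which is the same mechanism, in miniature, by which the paper's LMI conditions guarantee global asymptotic stability.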