Global convergence of neural networks with mixed time-varying delays and discontinuous neuron activations

  • Authors:
  • Jun Liu; Xinzhi Liu; Wei-Chau Xie

  • Affiliations:
  • Jun Liu and Xinzhi Liu: Department of Applied Mathematics, University of Waterloo, Waterloo, Ontario, Canada N2L 3G1; Wei-Chau Xie: Department of Civil and Environmental Engineering, University of Waterloo, Waterloo, Ontario, Canada N2L 3G1

  • Venue:
  • Information Sciences: an International Journal
  • Year:
  • 2012

Abstract

In this paper, we investigate the dynamical behavior of a class of delayed neural networks with discontinuous neuron activations and general mixed time-delays involving both time-varying delays and distributed delays. Due to the presence of time-varying and distributed delays, the step-by-step construction of local solutions cannot be applied. This difficulty is overcome by constructing a sequence of solutions to delayed dynamical systems with high-slope activations and showing that this sequence converges to a desired Filippov solution of the discontinuous delayed neural networks. We then derive two sets of sufficient conditions for the global exponential stability and convergence of the neural networks, in terms of linear matrix inequalities (LMIs) and M-matrix properties (equivalently, certain diagonal dominance conditions), respectively. The convergence behavior of both the neuron state and the neuron output is discussed. The obtained results extend previous work on the global stability of delayed neural networks with Lipschitz-continuous neuron activations, and of neural networks with discontinuous neuron activations and only constant delays.
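
For orientation, a minimal sketch of the class of systems the abstract describes, assuming the standard additive mixed-delay formulation; the symbols $d_i$, $a_{ij}$, $b_{ij}$, $c_{ij}$, $\tau_{ij}(t)$, $\sigma_{ij}$, and $I_i$ are illustrative and not taken from the paper:

$$
\dot{x}_i(t) = -d_i x_i(t)
  + \sum_{j=1}^{n} a_{ij}\, f_j\bigl(x_j(t)\bigr)
  + \sum_{j=1}^{n} b_{ij}\, f_j\bigl(x_j(t-\tau_{ij}(t))\bigr)
  + \sum_{j=1}^{n} c_{ij} \int_{t-\sigma_{ij}}^{t} f_j\bigl(x_j(s)\bigr)\,ds
  + I_i, \qquad i = 1,\dots,n,
$$

where the activations $f_j$ may be discontinuous, so solutions are understood in the Filippov (differential-inclusion) sense. Roughly speaking, the time-varying delays $\tau_{ij}(t)$ and the distributed-delay integrals are what rule out the usual step-by-step (method-of-steps) construction of solutions, since there is no fixed delay interval over which to advance the solution piecewise.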