Global exponential stability and global convergence in finite time of delayed neural networks with infinite gain

  • Authors:
  • M. Forti, P. Nistri, D. Papini

  • Affiliations:
  • Dipt. di Ingegneria dell'Informazione, Univ. di Siena, Italy

  • Venue:
  • IEEE Transactions on Neural Networks
  • Year:
  • 2005

Abstract

This paper introduces a general class of neural networks with arbitrary constant delays in the neuron interconnections, and neuron activations belonging to the set of discontinuous, monotone increasing, and (possibly) unbounded functions. The discontinuities in the activations are an ideal model of the situation where the gain of the neuron amplifiers is very high and tends to infinity, while the delay accounts for the finite switching speed of the neuron amplifiers, or the finite signal propagation speed. It is known that the delay, in combination with high-gain nonlinearities, is a particularly harmful source of potential instability. The goal of this paper is to single out a subclass of the considered discontinuous neural networks for which stability is instead insensitive to the presence of a delay. More precisely, conditions are given under which there is a unique equilibrium point of the neural network, which is globally exponentially stable for the states, with a known convergence rate. The conditions are easily testable and independent of the delay. Moreover, global convergence in finite time of the state and output is investigated. In doing so, interesting new dynamical phenomena are highlighted with respect to the case without delay, which make the study of convergence in finite time significantly more difficult. The obtained results extend previous work on the global stability of delayed neural networks with Lipschitz-continuous neuron activations, and of neural networks with discontinuous neuron activations but without delays.
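
The abstract does not reproduce the network model itself. A standard form for this class of delayed additive (Hopfield-type) networks, sketched here on the assumption that the paper follows the usual setup (the symbols D, A, B, tau, I below are generic placeholders, not notation taken from the paper), is

```latex
\dot{x}(t) = -D\,x(t) + A\,f\bigl(x(t)\bigr) + B\,f\bigl(x(t-\tau)\bigr) + I ,
```

where x(t) is the n-dimensional state, D is a diagonal matrix of positive self-inhibition rates, A and B are the interconnection matrices for the instantaneous and delayed feedback, tau > 0 is the constant delay, I is a constant input vector, and each activation f_i is monotone increasing, possibly discontinuous and unbounded. With discontinuous activations, solutions must be understood in a generalized (e.g., Filippov) sense, which is what makes existence, uniqueness, and stability of the equilibrium nontrivial to establish.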
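
To make the infinite-gain limit and the finite-time convergence claim concrete, the following minimal simulation sketch (not from the paper; all parameter values are hypothetical) integrates a scalar instance of the model above with the discontinuous activation f(u) = sign(u), using forward Euler with a constant initial history on [-tau, 0]:

```python
import numpy as np

# Hypothetical scalar instance of the delayed model above:
#   x'(t) = -d*x(t) + a*f(x(t)) + b*f(x(t - tau)) + I_ext
# with the infinite-gain (discontinuous) activation f(u) = sign(u).
d, a, b, I_ext = 1.0, -2.0, -0.5, 0.0   # illustrative values only
tau, dt, T = 1.0, 1e-3, 6.0

def f(u):
    return np.sign(u)  # discontinuous, monotone nondecreasing activation

n_delay = int(round(tau / dt))           # delay expressed in time steps
n_steps = int(round(T / dt))

# State array carrying the constant initial history x(s) = 1.5 on [-tau, 0].
x = np.empty(n_delay + n_steps + 1)
x[: n_delay + 1] = 1.5

# Forward Euler; x[k - n_delay] plays the role of x(t - tau).
for k in range(n_delay, n_delay + n_steps):
    x[k + 1] = x[k] + dt * (-d * x[k] + a * f(x[k])
                            + b * f(x[k - n_delay]) + I_ext)

t = np.arange(-n_delay, n_steps + 1) * dt
# With these signs the right-hand side stays bounded away from zero near
# x = 0, so the state hits the equilibrium x = 0 in finite time and then
# chatters within O(dt) of it (a discretization artifact).
hit = t[np.argmax(np.abs(x) < 0.01)]
print(f"|x| first drops below 0.01 at t ~ {hit:.2f}")
```

Because the vector field does not vanish as the state approaches the equilibrium, the trajectory reaches it in finite time rather than merely exponentially fast; the residual chattering is an artifact of integrating a discontinuous right-hand side with an explicit scheme, not a property of the Filippov solution.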