Global Exponential Stability and Global Convergence in Finite Time of Neural Networks with Discontinuous Activations

  • Authors:
  • Sitian Qin; Xiaoping Xue

  • Affiliations:
  • Department of Mathematics, Harbin Institute of Technology, Harbin 150001, People's Republic of China (both authors)

  • Venue:
  • Neural Processing Letters
  • Year:
  • 2009

Abstract

In this paper, we consider a general class of neural networks with arbitrary constant delays in the neuron interconnections and with neuron activations belonging to the set of discontinuous, monotone increasing, and (possibly) unbounded functions. Based on topological degree theory and the Lyapunov functional method, we provide new sufficient conditions for the global exponential stability and global convergence in finite time of these delayed neural networks. Under these conditions, the uniqueness of the solution to the initial value problem (IVP) is proved. The exponential convergence rate can be quantitatively estimated from the parameters defining the neural network. These conditions are easily testable and independent of the delay. Finally, some remarks and examples are discussed to compare the present results with existing ones.
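
The abstract does not state the model explicitly here; as a minimal sketch, the standard delayed network typically studied in this setting takes the form below (the symbol names $d_i$, $a_{ij}$, $b_{ij}$, $\tau_{ij}$, $I_i$ are illustrative and may differ from the paper's notation):

$$\dot{x}_i(t) = -d_i x_i(t) + \sum_{j=1}^{n} a_{ij}\, f_j\bigl(x_j(t)\bigr) + \sum_{j=1}^{n} b_{ij}\, f_j\bigl(x_j(t-\tau_{ij})\bigr) + I_i, \qquad i = 1, \dots, n,$$

where $d_i > 0$ are self-inhibition rates, $a_{ij}$ and $b_{ij}$ are interconnection weights, $\tau_{ij} \ge 0$ are constant delays, $I_i$ are constant external inputs, and each activation $f_j$ is monotone nondecreasing and possibly discontinuous and unbounded, with solutions understood in the Filippov (differential-inclusion) sense.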