Global Output Convergence of a Class of Recurrent Delayed Neural Networks with Discontinuous Neuron Activations

  • Authors:
Zhenyuan Guo; Lihong Huang

  • Affiliations:
College of Mathematics and Econometrics, Hunan University, Changsha 410082, People's Republic of China (both authors)

  • Venue:
  • Neural Processing Letters
  • Year:
  • 2009

Abstract

This paper studies the global output convergence of a class of recurrent delayed neural networks with time-varying inputs. We consider non-decreasing activations that may have jump discontinuities, in order to model the ideal situation where the gain of the neuron amplifiers is very high and tends to infinity. In particular, we drop the assumptions of Lipschitz continuity and boundedness on the activation functions, which are required in most existing works. Because the activation functions may be discontinuous, we introduce a suitable notion of limit to study the convergence of the output of the recurrent delayed neural networks. Under suitable assumptions on the interconnection matrices and the time-varying inputs, we establish a sufficient condition for the global output convergence of this class of neural networks. The convergence results are useful in solving some optimization problems and in designing recurrent delayed neural networks with discontinuous neuron activations.
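
Model sketch

The abstract does not reproduce the governing equations, but recurrent delayed neural networks of this class are commonly written in the following form (a sketch under that assumption; the matrices D, A, B, the delay \tau, and the input u below are illustrative and are not taken from the paper):

    \dot{x}(t) = -D\,x(t) + A\,g(x(t)) + B\,g(x(t-\tau)) + u(t), \qquad t \ge 0,

where x(t) \in \mathbb{R}^n is the state, D is a positive diagonal matrix of self-inhibition rates, A and B are the interconnection and delayed-interconnection matrices, \tau > 0 is the transmission delay, u(t) is the time-varying input, and g = (g_1, \dots, g_n)^{\mathsf{T}} collects the non-decreasing, possibly discontinuous neuron activations. Global output convergence then means that the output y(t) = g(x(t)) tends to a limit as t \to \infty for every initial condition, interpreted through one-sided limits at points where g jumps, which is the role of the generalized notion of limit mentioned in the abstract.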