On the global output convergence of a class of recurrent neural networks with time-varying inputs

  • Authors:
  • Sanqing Hu; Derong Liu

  • Affiliations:
  • Department of Electrical and Computer Engineering, University of Illinois at Chicago, Chicago, IL 60607, USA (both authors)

  • Venue:
  • Neural Networks
  • Year:
  • 2005


Abstract

This paper studies the global output convergence of a class of recurrent neural networks with globally Lipschitz continuous, monotone nondecreasing activation functions and locally Lipschitz continuous time-varying inputs. We establish two sufficient conditions for global output convergence of this class of neural networks. Symmetry of the connection weight matrix is not required, so the present results extend existing ones.
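The abstract does not reproduce the model equations, but networks of this kind are commonly written as dx/dt = -Dx + Wg(x) + u(t) with output y = g(x). The sketch below is a hypothetical numerical illustration under that assumption: g = tanh is globally Lipschitz and nondecreasing, the input u(t) is locally Lipschitz and settles to a constant, and the weight matrix W is deliberately asymmetric. All numerical values are illustrative, not taken from the paper.

```python
import numpy as np

def simulate(W, D, u, x0, t_end=50.0, dt=1e-3):
    """Forward-Euler integration of the assumed network dynamics
    dx/dt = -D x + W g(x) + u(t), with activation g = tanh."""
    x = np.array(x0, dtype=float)
    for k in range(int(t_end / dt)):
        t = k * dt
        x += dt * (-D @ x + W @ np.tanh(x) + u(t))
    return np.tanh(x)  # network output y = g(x)

# Asymmetric connection weights: symmetry is not required by the results.
W = np.array([[0.2, -0.5],
              [0.3,  0.1]])
D = np.eye(2)

# Locally Lipschitz time-varying input that converges to a constant.
u = lambda t: np.array([1.0, -0.5]) + np.exp(-t) * np.array([0.3, 0.2])

# Outputs from two different initial states approach the same limit,
# illustrating global output convergence for this example.
y = simulate(W, D, u, x0=[2.0, -1.0])
y_other = simulate(W, D, u, x0=[-3.0, 4.0])
```

Here the spectral norm of W is below 1 while D = I, which makes the dynamics contractive, so the output converges regardless of the initial state; the paper's sufficient conditions are more general than this simple contraction argument.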