New Results on Exponential Convergence for HRNNs with Continuously Distributed Delays in the Leakage Terms

  • Authors:
  • Yuehua Yu; Weidong Jiao

  • Affiliations:
  • College of Mathematics and Computer Science, Hunan University of Arts and Science, Changde 415000, People's Republic of China; College of Engineering, Zhejiang Normal University, Jinhua 321004, People's Republic of China

  • Venue:
  • Neural Processing Letters
  • Year:
  • 2014

Abstract

This paper is concerned with exponential convergence for a class of high-order recurrent neural networks (HRNNs) with continuously distributed delays in the leakage terms. Without assuming boundedness of the activation functions, sufficient conditions are derived to ensure that all solutions of the networks converge exponentially to the zero point, using the Lyapunov functional method and differential inequality techniques; these conditions correct some recent results of Chen and Yang (Neural Comput Appl, doi:10.1007/s00521-012-1172-2, 2012). Moreover, we propose a new approach to proving the exponential convergence of HRNNs with continuously distributed leakage delays.
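
For orientation, the display below sketches the typical form of an HRNN with a continuously distributed delay in the leakage term, as commonly studied in this literature; the notation (c_i, K_i, a_ij, b_ijl, f_j, tau, I_i) is a generic placeholder and is not necessarily the exact system or assumptions used in the paper.

\dot{x}_i(t) = -c_i \int_{0}^{\infty} K_i(s)\, x_i(t-s)\, ds
  + \sum_{j=1}^{n} a_{ij}\, f_j\bigl(x_j(t-\tau_{ij}(t))\bigr)
  + \sum_{j=1}^{n} \sum_{l=1}^{n} b_{ijl}\, f_j\bigl(x_j(t-\sigma_{ijl}(t))\bigr)\, f_l\bigl(x_l(t-\sigma_{ijl}(t))\bigr)
  + I_i(t), \qquad i = 1, \dots, n.

Here the integral term is the leakage term with a continuously distributed delay kernel K_i, and the second-order (product) interaction terms are what make the network "high-order"; exponential convergence of all solutions to zero then typically requires a suitable decay assumption on the inputs I_i(t).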