This paper reveals two important characterizations of global exponential stability (GES) for a generic class of continuous-time recurrent neural networks. First, we show that GES of the neural networks can be fully characterized by global asymptotic stability (GAS) of the networks together with the condition that the maximum abscissa of the spectral set of the Jacobian matrix of the networks at the unique equilibrium point is less than zero. This result provides a direct way to distinguish GES from GAS for the networks. Second, we show that when the networks have small state feedback coefficients, the supremum of the exponential convergence rates (ECRs) of trajectories of the networks is exactly equal to the absolute value of the maximum abscissa of the spectral set of the Jacobian matrix at the unique equilibrium point. Here, the supremum of the ECRs indicates the potentially fastest speed of trajectory convergence. The obtained results are helpful in understanding the essence of GES and in clarifying the difference between GES and GAS for continuous-time recurrent neural networks.
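The two quantities the abstract relates can be illustrated numerically. The sketch below assumes a standard additive recurrent network of the form dx/dt = -x + W·tanh(x) (a common instance of the generic class; the specific weight matrix is made up for illustration). With zero bias, x* = 0 is the unique equilibrium, so the Jacobian there is J = -I + W, and the maximum abscissa of its spectral set is max Re λ(J). The simulation then checks that the observed decay rate of a trajectory approaches |max Re λ(J)|, as the second characterization predicts for small state feedback coefficients.

```python
import numpy as np

# Hypothetical additive recurrent network: dx/dt = -x + W @ tanh(x).
# W is an illustrative matrix with small state feedback coefficients.
W = np.array([[0.20, -0.10],
              [0.10,  0.15]])

# With zero bias, x* = 0 is the equilibrium (tanh(0) = 0), and since
# tanh'(0) = 1 the Jacobian at x* is simply J = -I + W.
J = -np.eye(2) + W

# Maximum abscissa of the spectral set of J; alpha < 0 (plus GAS)
# characterizes GES, and -alpha is the supremum of the ECRs.
alpha = max(np.linalg.eigvals(J).real)

# Forward-Euler simulation of one trajectory to estimate its decay rate.
dt, T = 1e-3, 20.0
x = np.array([1.0, -0.5])
for _ in range(int(T / dt)):
    x = x + dt * (-x + W @ np.tanh(x))

# Observed exponential rate from ||x(T)|| ~ exp(-rate * T).
observed_rate = -np.log(np.linalg.norm(x)) / T
print(alpha, observed_rate)  # observed_rate should be close to -alpha
```

Running this with the matrix above gives alpha ≈ -0.825, and the measured trajectory decay rate agrees with |alpha| to within a few percent, consistent with the claim that the supremum of the ECRs equals the absolute value of the maximum abscissa.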