One reason for applying Lyapunov's direct method to artificial neural networks (ANNs) is to design dynamical neural networks that exhibit global asymptotic stability. Lyapunov functions that appear frequently in the ANN literature include the quadratic function, the Persidskii function, and the Lur'e-Postnikov function. This contribution revisits the quadratic function and shows that Krasovskii-like stability criteria provide a simple and systematic procedure for obtaining not only new and more general results but also well-known sufficient conditions for convergence recently established by non-Lyapunov methods, such as the matrix measure and the nonlinear measure.
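To make the approach concrete, the following is a minimal sketch of the classical Krasovskii criterion with a quadratic function. The additive (Hopfield-type) network model and the symbols D, W, g, L_i, and P below are standard illustrative assumptions, not taken from the paper itself; the paper's exact model and conditions may differ.

```latex
\documentclass{article}
\usepackage{amsmath,amssymb}
\begin{document}

% Additive (Hopfield-type) network model -- an illustrative assumption,
% not necessarily the exact model treated in the paper:
\[
  \dot{x}(t) = f\bigl(x(t)\bigr) = -D\,x(t) + W g\bigl(x(t)\bigr) + u,
  \qquad D = \operatorname{diag}(d_1,\dots,d_n),\; d_i > 0,
\]
where each activation $g_i$ is differentiable with $0 \le g_i'(s) \le L_i$.

% Krasovskii's quadratic Lyapunov candidate, built from f rather than x:
\[
  V(x) = f(x)^{\top} P\, f(x), \qquad P = P^{\top} \succ 0 .
\]

% Along trajectories, with Jacobian
% J(x) = -D + W \operatorname{diag}(g_1'(x_1),\dots,g_n'(x_n)):
\[
  \dot{V}(x) = f(x)^{\top}\bigl(J(x)^{\top} P + P\,J(x)\bigr) f(x) .
\]

% Krasovskii-like sufficient condition for global asymptotic stability
% (together with radial unboundedness of V):
\[
  J(x)^{\top} P + P\,J(x) \preceq -\varepsilon I
  \quad \text{for all } x \text{ and some } \varepsilon > 0 .
\]

% Taking P = I recovers a matrix-measure condition,
% \mu_2(J(x)) = \tfrac{1}{2}\lambda_{\max}(J(x) + J(x)^{\top}) < 0,
% i.e. one of the non-Lyapunov criteria the abstract mentions.

\end{document}
```

Note the design choice: with P = I the Krasovskii condition collapses to negativity of the symmetric part of the Jacobian, which is exactly the matrix-measure bound, illustrating how a single quadratic function can subsume the non-Lyapunov criteria cited in the abstract.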