In this paper, we derive new conditions for absolute exponential stability (AEST) of a class of recurrent neural networks with multiple and variable delays. By applying Hölder's inequality and Young's inequality to estimate the derivatives of the Lyapunov functionals, we establish more general results than several existing ones. The first type of condition involves convex combinations of the column-sum and row-sum dominance of the neural network weight matrices, while the second type involves the p-norm of the weight matrices with p ∈ [1, +∞].
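To make the two condition types concrete, the sketch below computes (i) a convex combination of column-sum and row-sum dominance margins of a weight matrix and (ii) induced matrix p-norms for the cases with closed forms. The function names, the unit dominance threshold, and the example matrix are illustrative assumptions, not quantities from the paper; the actual AEST conditions also involve delay terms and activation bounds not shown here.

```python
import numpy as np

def matrix_p_norm(W, p):
    """Induced p-norm of W for p in {1, 2, inf} (cases with closed forms)."""
    if p == 1:
        return np.abs(W).sum(axis=0).max()   # maximum absolute column sum
    if p == np.inf:
        return np.abs(W).sum(axis=1).max()   # maximum absolute row sum
    if p == 2:
        return np.linalg.norm(W, 2)          # largest singular value
    raise ValueError("closed form implemented only for p in {1, 2, inf}")

def convex_dominance_margin(W, lam):
    """Hypothetical dominance check: convex combination (weight lam in [0, 1])
    of per-index column-sum and row-sum margins of |W| against a unit threshold.
    All margins positive suggests a dominance-type condition of the first kind."""
    absW = np.abs(W)
    col_margin = 1.0 - absW.sum(axis=0)      # column-sum dominance margins
    row_margin = 1.0 - absW.sum(axis=1)      # row-sum dominance margins
    return lam * col_margin + (1.0 - lam) * row_margin

# Illustrative weight matrix (assumed, not from the paper)
W = np.array([[0.2, -0.1],
              [0.1,  0.3]])
print(matrix_p_norm(W, 1))                   # → 0.4 (max column sum of |W|)
print(convex_dominance_margin(W, 0.5))       # → [0.7 0.6], all positive
```

Note that for p = 1 and p = ∞ the induced norm reduces exactly to the column-sum and row-sum quantities, which is why the p-norm conditions with p ∈ [1, +∞] interpolate between the two dominance notions.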