This paper analyzes the robustness of the global exponential stability of recurrent neural networks subject to parameter uncertainty in the connection weight matrix. Given a globally exponentially stable recurrent neural network, the problem addressed herein is how much parameter uncertainty the connection weight matrix can tolerate while the network remains globally exponentially stable. We characterize upper bounds on the parameter uncertainty under which the recurrent neural network sustains global exponential stability. A numerical example is provided to illustrate the theoretical result.
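The idea of a perturbation bound can be sketched numerically. The example below is a hypothetical illustration, not the paper's actual criterion: for the standard recurrent dynamics dx/dt = -x + W·tanh(x), a classical sufficient condition for global exponential stability is that the spectral norm of the effective weight matrix stay below 1 (tanh has Lipschitz constant 1). Under that condition, any perturbation dW with ||dW||₂ < 1 − ||W||₂ preserves stability, which we check by simulation; the matrix W, the perturbation scaling, and the initial state are all made up for the demonstration.

```python
import numpy as np

def spectral_norm(M):
    # Largest singular value, i.e. the induced 2-norm of M.
    return np.linalg.norm(M, 2)

# A nominal weight matrix chosen (arbitrarily) so that ||W||_2 < 1.
W = np.array([[0.2, -0.3],
              [0.1,  0.25]])

# Admissible perturbation size under the sufficient condition ||W + dW||_2 < 1.
margin = 1.0 - spectral_norm(W)

# Random perturbation scaled to lie strictly inside the margin.
rng = np.random.default_rng(0)
dW = rng.standard_normal((2, 2))
dW *= 0.9 * margin / spectral_norm(dW)

def simulate(W_eff, x0, dt=0.01, steps=20000):
    # Forward-Euler integration of dx/dt = -x + W_eff @ tanh(x).
    x = x0.copy()
    for _ in range(steps):
        x += dt * (-x + W_eff @ np.tanh(x))
    return x

# The perturbed network should still contract toward the origin
# (the unique equilibrium, since tanh(0) = 0).
x_final = simulate(W + dW, np.array([2.0, -3.0]))
print(spectral_norm(W + dW), np.linalg.norm(x_final))
```

Note that ||dW||₂ < 1 − ||W||₂ is only a sufficient condition derived from the triangle inequality; the bounds characterized in the paper may be sharper.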