In this paper, we present a general analysis of the global convergence of recurrent neural networks (RNNs) with projection mappings in the critical case, i.e., when M(L, Γ), a matrix determined by the weight matrices and the activation mappings of the networks, is nonnegative definite for some positive diagonal matrix Γ. Considerable stability results have been obtained for RNNs in the noncritical case, in which M(L, Γ) is positive definite; by contrast, only a few results are available under critical conditions. Compared with the existing critical studies, the critical stability results in this paper require no additional assumptions on the weight matrices, apply to RNNs with general projection mappings rather than only nearest-point projection mappings, and hold for both of the two fundamental RNN models. The results established for several typical RNN models unify, sharpen, or generalize most of the existing stability assertions. Two examples are given to demonstrate both the theoretical importance and the practical feasibility of the critical results obtained.
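The abstract does not define M(L, Γ), so the exact matrix must be taken from the paper itself. As an illustration only, the sketch below assumes a form common in this literature, M(L, Γ) = ΓL⁻¹ − (ΓW + WᵀΓ)/2, where W is the weight matrix, L is the diagonal matrix of activation Lipschitz constants, and Γ is a candidate positive diagonal matrix; it then checks nonnegative definiteness numerically via the eigenvalues of the symmetric M. The function name and the form of M are assumptions, not the paper's definitions.

```python
import numpy as np

def critical_condition_holds(W, L, gamma, tol=1e-10):
    """Check whether the (hypothetical) matrix M(L, Gamma) is nonnegative definite.

    Assumed illustrative form (NOT taken from the paper):
        M(L, Gamma) = Gamma @ inv(L) - (Gamma @ W + W.T @ Gamma) / 2
    where L holds the Lipschitz constants of the activation mappings and
    gamma holds the diagonal entries of a positive diagonal matrix Gamma.
    """
    Gamma = np.diag(np.asarray(gamma, dtype=float))
    Linv = np.diag(1.0 / np.asarray(L, dtype=float))
    M = Gamma @ Linv - (Gamma @ W + W.T @ Gamma) / 2.0
    # M is symmetric by construction; nonnegative definiteness holds iff
    # its smallest eigenvalue is >= 0 (up to numerical tolerance).
    return np.min(np.linalg.eigvalsh(M)) >= -tol

# Example: with Gamma = I and unit Lipschitz constants, M = I - (W + W.T)/2.
# For this W the eigenvalues of M are {0, 1}: nonnegative definite but
# singular, i.e., exactly the critical case the paper addresses.
W = np.array([[0.5, 0.5],
              [0.5, 0.5]])
ok = critical_condition_holds(W, L=[1.0, 1.0], gamma=[1.0, 1.0])
```

In this toy instance the smallest eigenvalue of M is exactly zero, so M(L, Γ) is nonnegative but not positive definite, which is precisely the critical regime where the noncritical stability theorems no longer apply.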