Qualitative Analysis and Synthesis of Recurrent Neural Networks
Convergence Analysis of Recurrent Neural Networks (Network Theory and Applications, V. 13)
Stability and chaos of a neural network with uncertain time delays
ISNN'05 Proceedings of the Second International Conference on Advances in Neural Networks - Volume Part I
Global exponential stability in Lagrange sense of continuous-time recurrent neural networks
ISNN'06 Proceedings of the Third International Conference on Advances in Neural Networks - Volume Part I
Robust stability for interval Hopfield neural networks with time delay
IEEE Transactions on Neural Networks
Computers & Mathematics with Applications
Global Passivity of Stochastic Neural Networks with Time-Varying Delays
ISNN '09 Proceedings of the 6th International Symposium on Neural Networks on Advances in Neural Networks
Equilibrium Analysis for Improved Signal Range Model of Delayed Cellular Neural Networks
Neural Processing Letters
Passivity analysis of neural networks with discrete and distributed delays
International Journal of Systems, Control and Communications
Invariant set and attractor of discrete-time impulsive recurrent neural networks
ISNN'11 Proceedings of the 8th International Conference on Advances in Neural Networks - Volume Part I
LMI-based Lagrange stability of CGNNs with general activation functions and mixed delays
ICSI'12 Proceedings of the Third International Conference on Advances in Swarm Intelligence - Volume Part I
Global robust exponential stability in Lagrange sense for interval delayed neural networks
ISNN'13 Proceedings of the 10th International Conference on Advances in Neural Networks - Volume Part I
In this paper, global exponential stability in the Lagrange sense is further studied for various continuous-time delayed recurrent neural networks with two different types of activation functions. Based on the parameters of the systems, detailed estimates of globally exponentially attractive sets and positive invariant sets are presented without any hypothesis on their existence. It is also verified that outside the globally exponentially attractive set, i.e., within the global attraction domain, there is no equilibrium state, periodic state, almost-periodic state, or chaotic attractor of the neural network. This theoretical analysis narrows the search field for optimization computation, associative memories, and chaos control and synchronization, and thereby facilitates applications.
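The notion of Lagrange exponential stability used above can be sketched in a standard formulation (the system form and the symbols $\Omega$, $M$, and $\alpha$ are illustrative here, not taken verbatim from the paper):

```latex
% A delayed recurrent neural network of the type considered:
%   \dot{x}(t) = -D x(t) + A g(x(t)) + B g(x(t - \tau(t))) + u,
% with state x(t) \in \mathbb{R}^n and activation function g.
%
% A compact set \Omega \subset \mathbb{R}^n is said to be globally
% exponentially attractive if there exists \alpha > 0 such that, for
% every solution x(t; x_0), some constant M(x_0) > 0 gives
%   \operatorname{dist}\bigl(x(t; x_0), \Omega\bigr)
%     \le M(x_0)\, e^{-\alpha t}, \qquad t \ge 0.
%
% The network is globally exponentially stable in the Lagrange sense
% if its solutions are uniformly bounded and such a set \Omega exists;
% all equilibria, periodic orbits, and chaotic attractors (if any)
% must then lie inside \Omega, which is why the search field for
% them can be restricted to \Omega.
```

This is why the estimates of $\Omega$ require no hypothesis on the existence of equilibria: Lagrange stability concerns the boundedness and attraction of all trajectories, not the stability of a particular equilibrium point.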