In this paper, we study the stability problem for a class of stochastic delayed recurrent neural networks. In contrast to traditional stability criteria, we introduce and analyze a new one: mean-square exponential input-to-state stability. To the best of our knowledge, this criterion has not previously been discussed in the context of stochastic recurrent neural networks, and the main objective of this paper is to fill that gap. Using a Lyapunov-Krasovskii functional, stochastic analysis theory, and Itô's formula, we prove that the addressed system is mean-square exponentially input-to-state stable. Two numerical examples, together with their simulations, are presented to verify the theoretical results.
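The kind of simulation mentioned above can be sketched numerically. The following is a minimal illustration, not the paper's actual examples: a small stochastic recurrent network with a discrete delay, dx = [-Cx + Af(x(t)) + Bf(x(t-τ))] dt + σx dW, integrated with the Euler-Maruyama scheme, where the matrices C, A, B, the noise intensity σ, and the delay τ are all illustrative assumptions chosen so that the decay term dominates. The mean-square norm E|x(t)|² is estimated by Monte Carlo averaging over sample paths.

```python
import numpy as np

# Illustrative sketch (not the paper's exact model): a 2-neuron stochastic
# delayed recurrent network
#   dx = [-C x + A f(x(t)) + B f(x(t - tau))] dt + sigma * x dW,
# simulated with Euler-Maruyama; the delay is handled with a history buffer.
# All parameter values below are assumptions made for this demonstration.

rng = np.random.default_rng(0)

C = np.diag([2.0, 2.0])                    # self-feedback (decay) rates
A = np.array([[0.3, -0.2], [0.1, 0.2]])    # instantaneous connection weights
B = np.array([[0.2, 0.1], [-0.1, 0.2]])    # delayed connection weights
sigma = 0.2                                # noise intensity
tau = 0.5                                  # discrete delay
f = np.tanh                                # activation function

dt = 0.01
steps = 2000
lag = int(tau / dt)
paths = 200                                # Monte Carlo paths

x = np.ones((paths, 2))                    # initial state for every path
# Constant initial history over [-tau, 0]:
hist = np.tile(x[:, None, :], (1, lag + 1, 1))

for k in range(steps):
    x_del = hist[:, 0, :]                  # state at t - tau
    drift = -x @ C.T + f(x) @ A.T + f(x_del) @ B.T
    dW = rng.normal(0.0, np.sqrt(dt), size=x.shape)
    x = x + drift * dt + sigma * x * dW    # Euler-Maruyama step
    hist = np.concatenate([hist[:, 1:, :], x[:, None, :]], axis=1)

# Empirical mean-square norm E|x(t)|^2 at the final time:
ms = float(np.mean(np.sum(x**2, axis=1)))
print(ms)
```

With zero input, exponential decay of the empirical mean-square norm from its initial value (here 2.0) toward zero is consistent with mean-square exponential input-to-state stability; under a bounded nonzero input one would instead expect the trajectories to settle into a bounded neighborhood of the origin.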