In this paper, we investigate the dynamics of memristor-based recurrent networks with bounded activation functions and bounded time-varying delays in the presence of strong external stimuli. It is shown that such networks are globally exponentially stable whenever the external stimuli are sufficiently strong; no other conditions are required. A sufficient condition on the bounds of the stimuli is derived for the global exponential stability of memristor-based recurrent networks, with all results stated in the sense of Filippov solutions. Simulation results illustrate how the criteria can be used to ascertain the global exponential stability of specific networks.
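The stabilizing effect of a strong stimulus can be illustrated with a minimal numerical sketch. The toy model below is an assumption, not the paper's exact system: a single neuron with a bounded activation (tanh), a constant delay, state-dependent (memristive) weights that switch with |x|, and a large constant input u. All parameter values are illustrative. Trajectories started from widely separated initial conditions converge to the same equilibrium, consistent with global exponential stability under a sufficiently strong stimulus.

```python
import numpy as np

def simulate(x0, T=20.0, dt=0.001, tau=0.5, u=10.0):
    """Euler simulation of a 1-neuron memristor-based delayed network.

    Illustrative model (not the paper's exact system):
        x'(t) = -d*x(t) + a(x)*f(x(t)) + b(x)*f(x(t - tau)) + u,
    where a(x), b(x) switch with |x| (memristive connection weights)
    and f = tanh is a bounded activation.  All parameters are assumed.
    """
    d = 1.0
    f = np.tanh
    n_delay = int(tau / dt)        # number of steps in the delay buffer
    steps = int(T / dt)
    hist = [x0] * (n_delay + 1)    # constant initial history on [-tau, 0]
    x = x0
    for _ in range(steps):
        # State-dependent (switched) weights, as in memristive models
        a = 2.0 if abs(x) <= 1.0 else 1.5
        b = -1.0 if abs(x) <= 1.0 else -0.5
        x_delayed = hist[0]        # state tau seconds in the past
        x = x + dt * (-d * x + a * f(x) + b * f(x_delayed) + u)
        hist.append(x)
        hist = hist[-(n_delay + 1):]
    return x

# Trajectories from far-apart initial states converge under a strong stimulus.
x_a = simulate(x0=-5.0)
x_b = simulate(x0=5.0)
print(abs(x_a - x_b))   # small: both settle at the same equilibrium
```

Because f is bounded, the total recurrent drive is bounded, so a large enough u dominates it and pins the activation near saturation, which is the intuition behind the sufficiency condition on the stimulus bounds.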