This paper introduces a general class of neural networks with arbitrary constant delays in the neuron interconnections, and neuron activations belonging to the set of discontinuous monotone increasing and (possibly) unbounded functions. The discontinuities in the activations are an ideal model of the situation where the gain of the neuron amplifiers is very high and tends to infinity, while the delay accounts for the finite switching speed of the neuron amplifiers, or the finite signal propagation speed. It is known that the delay in combination with high-gain nonlinearities is a particularly harmful source of potential instability. The goal of this paper is to single out a subclass of the considered discontinuous neural networks for which stability is instead insensitive to the presence of a delay. More precisely, conditions are given under which there is a unique equilibrium point of the neural network, which is globally exponentially stable for the states, with a known convergence rate. The conditions are easily testable and independent of the delay. Moreover, global convergence in finite time of the state and output is investigated. In doing so, new interesting dynamical phenomena are highlighted with respect to the case without delay, which make the study of convergence in finite time significantly more difficult. The obtained results extend previous work on global stability of delayed neural networks with Lipschitz continuous neuron activations, and neural networks with discontinuous neuron activations but without delays.
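To make the setting concrete, the class of networks described above is commonly written as a delayed Hopfield-type system x'(t) = -Dx(t) + A f(x(t)) + B f(x(t - τ)) + I, where the discontinuous activation (here f = sign) is the limiting case of a neuron amplifier whose gain tends to infinity. The following is a minimal numerical sketch, not the paper's own construction: the weight matrices, delay, and initial history are illustrative values chosen so that a delay-independent dominance-type condition plausibly holds, and a simple Euler scheme exhibits the convergence toward the (here zero) equilibrium, including the finite-time settling followed by sliding along the discontinuity surface.

```python
import numpy as np

def simulate_delayed_network(T=5.0, dt=1e-3, tau=0.5):
    """Euler simulation of a delayed Hopfield-type network
        x'(t) = -x(t) + A f(x(t)) + B f(x(t - tau)),
    with the discontinuous activation f = sign (high-gain limit).
    All parameter values below are illustrative assumptions."""
    A = np.array([[-0.5, 0.2], [0.1, -0.5]])   # instantaneous interconnection weights
    B = np.array([[0.1, -0.1], [0.05, 0.1]])   # delayed interconnection weights
    n_delay = int(round(tau / dt))
    steps = int(round(T / dt))
    # constant initial history x(t) = [1.0, -0.8] on [-tau, 0]
    hist = np.tile(np.array([1.0, -0.8]), (n_delay + steps + 1, 1))
    for k in range(n_delay, n_delay + steps):
        x = hist[k]
        xd = hist[k - n_delay]                 # delayed state x(t - tau)
        dx = -x + A @ np.sign(x) + B @ np.sign(xd)
        hist[k + 1] = x + dt * dx
    return hist[n_delay:]                      # trajectory on [0, T]

traj = simulate_delayed_network()
print(np.linalg.norm(traj[0]), "->", np.linalg.norm(traj[-1]))
```

With these weights each component's drift points toward zero regardless of the sign pattern of the other neurons, so the state reaches a small neighborhood of the origin and then chatters there at the step-size scale, which is the explicit-Euler picture of the sliding-mode behavior that makes finite-time convergence analysis delicate in the delayed case.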