This paper studies the global output convergence of a class of recurrent delayed neural networks with time-varying inputs. We consider non-decreasing activation functions that may have jump discontinuities, in order to model the ideal situation where the gain of the neuron amplifiers is very high and tends to infinity. In particular, we drop the assumptions of Lipschitz continuity and boundedness on the activation functions, which are required in most existing works. Because of the possible discontinuities of the activation functions, we introduce a suitable notion of limit to study the convergence of the output of the recurrent delayed neural networks. Under suitable assumptions on the interconnection matrices and the time-varying inputs, we establish a sufficient condition for the global output convergence of this class of neural networks. The convergence results are useful for solving some optimization problems and for designing recurrent delayed neural networks with discontinuous neuron activations.
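For concreteness, a minimal sketch of the kind of model this abstract refers to; the paper's equations are not given here, so the symbols D, A, B, tau, g, and u below are illustrative assumptions, not the authors' stated notation:

% A hedged sketch, not the paper's stated model: a standard recurrent
% network with a constant transmission delay \tau and time-varying input u(t).
% All symbols below are assumptions introduced for illustration.
\begin{equation}
  \dot{x}(t) = -D\,x(t) + A\,g\bigl(x(t)\bigr) + B\,g\bigl(x(t-\tau)\bigr) + u(t),
  \qquad y(t) = g\bigl(x(t)\bigr),
\end{equation}

where $x(t)\in\mathbb{R}^n$ is the state, $D$ is a positive diagonal matrix of self-feedback rates, $A$ and $B$ are the interconnection and delayed interconnection matrices, $g=(g_1,\dots,g_n)$ collects the non-decreasing (possibly discontinuous and unbounded) activations, and $u(t)$ is the time-varying input. In this reading, global output convergence means that the output $y(t)=g(x(t))$ tends to a constant vector as $t\to\infty$ for every initial condition, with the limit understood in a sense adapted to the jump discontinuities of $g$.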