This paper studies the complete convergence of a class of neural networks with different time scales under the assumption that the activation functions are unsaturated piecewise linear. Under this assumption the network can possess multiple equilibrium points, so traditional convergence methods, which presume a unique equilibrium, cannot be applied. Complete convergence is instead proved by constructing an energy-like function, and simulations are employed to illustrate the theory.
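To make the setting concrete, the following is a minimal numerical sketch of a two-time-scale recurrent network with an unsaturated piecewise-linear (threshold-linear) activation f(x) = max(x, 0). The specific model form, weights, coupling, and time-scale ratio are illustrative assumptions, not the exact system analyzed in the paper; the sketch only shows a trajectory settling to one of the network's equilibria under forward-Euler integration.

```python
import numpy as np

def f(x):
    # unsaturated piecewise-linear (threshold-linear) activation
    return np.maximum(x, 0.0)

rng = np.random.default_rng(0)
n = 4

# symmetric weight matrix with small gains (assumed stable regime)
W = 0.2 * rng.standard_normal((n, n))
W = 0.5 * (W + W.T)
b = rng.standard_normal(n)

eps = 0.05      # small parameter: x evolves on the fast time scale
dt = 0.001      # Euler step
x = rng.standard_normal(n)   # fast state
y = np.zeros(n)              # slow state

for _ in range(200_000):
    dx = (-x + W @ f(x) + b + y) / eps   # fast dynamics
    dy = -y + 0.1 * f(x)                 # slow dynamics, coupled to x
    x += dt * dx
    y += dt * dy

# near an equilibrium both vector fields should be (almost) zero
res_fast = np.linalg.norm(-x + W @ f(x) + b + y)
res_slow = np.linalg.norm(-y + 0.1 * f(x))
print(res_fast, res_slow)
```

Because f is unbounded, boundedness of trajectories is not automatic here; it holds in this sketch only because the weights are chosen small, which loosely mirrors why the paper needs an energy-like function rather than standard single-equilibrium arguments.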