Convergent activation dynamics in continuous time networks. Neural Networks.
A deterministic annealing neural network for convex programming. Neural Networks.
Primal and dual assignment networks. IEEE Transactions on Neural Networks.
Estimate of exponential convergence rate and exponential stability for neural networks. IEEE Transactions on Neural Networks.
Analysis and design of an analog sorting network. IEEE Transactions on Neural Networks.
ISNN '07 Proceedings of the 4th International Symposium on Neural Networks: Advances in Neural Networks, Part III.
Long-range out-of-sample properties of autoregressive neural networks. Neural Computation.
ISNN '09 Proceedings of the 6th International Symposium on Neural Networks on Advances in Neural Networks.
A robust extended Elman backpropagation algorithm. IJCNN '09 Proceedings of the 2009 International Joint Conference on Neural Networks.
On the weight convergence of Elman networks. IEEE Transactions on Neural Networks.
Universal approach to study delayed dynamical systems. ICNC '05 Proceedings of the First International Conference on Advances in Natural Computation, Volume Part I.
Oscillatory behavior for a class of recurrent neural networks with time-varying input and delays. ICIC '11 Proceedings of the 7th International Conference on Intelligent Computing: Bio-Inspired Computing and Applications.
ISNN '12 Proceedings of the 9th International Conference on Advances in Neural Networks, Volume Part I.
This paper studies the global output convergence of a class of recurrent neural networks whose activation functions are globally Lipschitz continuous and monotone nondecreasing, and whose time-varying inputs are locally Lipschitz continuous. We establish two sufficient conditions for the global output convergence of this class of neural networks. The present results do not require the connection weight matrix to be symmetric, and they extend existing results.
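To make the setting concrete, the following is a minimal numerical sketch of the kind of system the abstract describes: a continuous-time recurrent network with a globally Lipschitz, monotone nondecreasing activation (here tanh), a nonsymmetric weight matrix, and a locally Lipschitz time-varying input that settles to a limit. The specific model form dx/dt = -x + W·g(x) + u(t), the parameters, and the helper `simulate` are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def simulate(W, u, x0, t_end=50.0, dt=1e-3):
    """Forward-Euler integration of dx/dt = -x + W @ tanh(x) + u(t).

    tanh is globally Lipschitz (constant 1) and monotone nondecreasing,
    matching the activation class considered in the paper.
    """
    x = np.asarray(x0, dtype=float)
    steps = int(t_end / dt)
    for k in range(steps):
        t = k * dt
        x = x + dt * (-x + W @ np.tanh(x) + u(t))
    return x

# Nonsymmetric connection weight matrix: symmetry is NOT required
# by the convergence conditions the paper establishes.
W = np.array([[0.2, -0.3],
              [0.1,  0.25]])

# Locally Lipschitz time-varying input that converges as t -> infinity.
u = lambda t: np.array([1.0, -0.5]) * (1.0 - np.exp(-t))

# The state (and hence the output) stops changing for large t:
x_at_50 = simulate(W, u, x0=[2.0, -1.0], t_end=50.0)
x_at_60 = simulate(W, u, x0=[2.0, -1.0], t_end=60.0)
print(np.allclose(x_at_50, x_at_60, atol=1e-3))
```

With the small-gain weight matrix above, the leak term -x dominates and the trajectory settles to a fixed point, so the two snapshots agree; this is a numerical illustration of output convergence, not a proof that the paper's sufficient conditions hold for these particular parameters.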