Periodic oscillation and exponential stability of a class of competitive neural networks
ISNN'05: Proceedings of the Second International Conference on Advances in Neural Networks, Part I
This paper presents theoretical results on the global exponential periodicity and global exponential stability of a class of recurrent neural networks with general activation functions and time-varying delays. The activation function classes considered include monotone nondecreasing functions, globally Lipschitz continuous and monotone nondecreasing functions, semi-Lipschitz continuous mixed monotone functions, and Lipschitz continuous functions. For each class, testable algebraic criteria for ascertaining global exponential periodicity and global exponential stability are derived using the comparison principle and the theory of monotone operators. Furthermore, the rate of exponential convergence and bounds on the domain of attraction of the periodic oscillations or equilibrium points are estimated. By admitting more general activation functions, the convergence analysis widens the scope of network models to which the results apply, and the analytical method enriches the toolbox for the qualitative analysis of neural networks.
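To illustrate what a "testable algebraic criterion" of this kind looks like in practice, the sketch below checks one standard sufficient condition for delayed recurrent networks of the form x'_i(t) = -d_i x_i(t) + Σ_j a_ij f_j(x_j(t)) + Σ_j b_ij f_j(x_j(t - τ_ij)) + u_i: if D - (|A| + |B|)·diag(L) is a nonsingular M-matrix, where L_j are the Lipschitz constants of the activations, global exponential stability follows. This is a generic criterion from the delayed-network literature, not necessarily the paper's exact condition, and the example matrices are purely illustrative.

```python
import numpy as np

def is_nonsingular_M_matrix(M, tol=1e-12):
    """A Z-matrix (nonpositive off-diagonal entries) is a nonsingular
    M-matrix iff all of its eigenvalues have positive real part."""
    off_diag = M - np.diag(np.diag(M))
    if np.any(off_diag > tol):
        return False  # not a Z-matrix
    return bool(np.all(np.linalg.eigvals(M).real > tol))

def stability_criterion(D, A, B, L):
    """Sufficient condition (illustrative): the network with
    self-decay rates D, connection matrix A, delayed connection
    matrix B, and activation Lipschitz constants L is globally
    exponentially stable if D - (|A| + |B|) diag(L) is a
    nonsingular M-matrix."""
    M = np.diag(D) - (np.abs(A) + np.abs(B)) @ np.diag(L)
    return is_nonsingular_M_matrix(M)

# Hypothetical 2-neuron example with tanh-like activations (L_j = 1)
D = np.array([3.0, 3.0])
A = np.array([[0.5, -0.4], [0.3, 0.6]])
B = np.array([[0.2, 0.1], [-0.3, 0.2]])
L = np.array([1.0, 1.0])
print(stability_criterion(D, A, B, L))  # → True
```

Criteria of this form are "testable" in the abstract's sense: they reduce a stability question about a nonlinear delayed dynamical system to a finite algebraic check on the network parameters.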