The globally convergent dynamics of a class of neural networks with normal connection matrices are studied using the Lyapunov function method and spectral analysis of the connection matrices. It is shown that such networks are absolutely stable if and only if every eigenvalue of the connection matrix has a nonpositive real part. This extends an existing result on symmetric neural networks to a larger class that includes certain asymmetric networks. Further extension of the present result to a certain non-normal case leads naturally to a quasi-normal matrix condition, which may be interpreted as a generalization of the so-called principle of detailed balance for the connection weights, or of the quasi-symmetry condition previously proposed in the literature in connection with symmetric neural networks. These results are of particular interest for neural optimization and classification problems.
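The stability criterion stated above can be checked numerically. The sketch below (an illustration, not the paper's method; function names are hypothetical) tests whether a connection matrix is normal, i.e. commutes with its transpose, and then verifies the eigenvalue condition:

```python
import numpy as np

def is_normal(W, tol=1e-9):
    # A real matrix W is normal iff W W^T = W^T W.
    return np.allclose(W @ W.T, W.T @ W, atol=tol)

def absolutely_stable(W, tol=1e-9):
    # For a normal connection matrix, the criterion is:
    # absolutely stable iff all eigenvalues have nonpositive real part.
    if not is_normal(W, tol):
        raise ValueError("criterion stated here applies to normal matrices")
    return bool(np.all(np.linalg.eigvals(W).real <= tol))

# A skew-symmetric matrix is normal but asymmetric; its eigenvalues
# are purely imaginary, so the criterion is satisfied.
W = np.array([[0.0, 1.0],
              [-1.0, 0.0]])
print(absolutely_stable(W))          # True
print(absolutely_stable(np.eye(2)))  # False: eigenvalue 1 > 0
```

The skew-symmetric example illustrates why the result goes beyond the symmetric case: such matrices are never symmetric (except the zero matrix) yet fall within the normal class covered by the criterion.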