Passivity Analysis of Dynamic Neural Networks with Different Time-scales
Neural Processing Letters
International Journal of Systems Science - Advances in Sliding Mode Observation and Estimation (Part Two)
ISNN'06 Proceedings of the Third international conference on Advances in Neural Networks - Volume Part I
We derive a condition for robust local stability of multilayer recurrent neural networks with two hidden layers. The stability condition follows from linking linearization theory, robustness analysis of linear systems under nonlinear perturbation, and matrix inequalities. A characterization of the basin of attraction of the origin is given in terms of a level set of a quadratic Lyapunov function. As in NLq theory, local stability is imposed around the origin and the apparent basin of attraction is made large by applying the criterion, while the proven basin of attraction remains relatively small due to the conservatism of the criterion. Modification of dynamic backpropagation training by the new stability condition is discussed and illustrated with simulation examples.
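To make the abstract's ingredients concrete, the following is a minimal sketch of the standard construction it builds on: linearize a two-hidden-layer recurrent map at the origin, check local stability via the spectral radius, and obtain a quadratic Lyapunov function from the discrete Lyapunov equation, whose level sets estimate the basin of attraction. The weight matrices and the network form `x_{k+1} = W2 tanh(W1 tanh(x_k))` are illustrative assumptions, not the paper's actual model or criterion.

```python
import numpy as np
from scipy.linalg import solve_discrete_lyapunov

# Hypothetical two-hidden-layer recurrent map: x_{k+1} = W2 tanh(W1 tanh(x_k)).
# Small illustrative weights (not from the paper) guarantee local stability.
W1 = np.array([[0.3, 0.1, 0.0],
               [0.0, 0.2, 0.1],
               [0.1, 0.0, 0.3]])
W2 = np.array([[0.2, 0.1, 0.0],
               [0.1, 0.3, 0.0],
               [0.0, 0.1, 0.2]])

# Linearization at the origin: tanh'(0) = 1, so the Jacobian is A = W2 @ W1.
A = W2 @ W1

# Local asymptotic stability of the origin iff spectral radius of A < 1.
rho = max(abs(np.linalg.eigvals(A)))

# Quadratic Lyapunov function V(x) = x' P x, with P > 0 solving the
# discrete Lyapunov equation A' P A - P = -I (solver uses A P A' - P = -Q,
# hence the transpose).
P = solve_discrete_lyapunov(A.T, np.eye(3))

# Along the linearized dynamics, V strictly decreases; a level set
# {x : x' P x <= c} contained in the region where the linearization
# dominates the nonlinear remainder estimates the basin of attraction.
x = np.array([0.1, -0.2, 0.05])
V0 = x @ P @ x
V1 = (A @ x) @ P @ (A @ x)
print("spectral radius:", rho)
print("V decreases:", V1 < V0)
```

By construction, `V1 - V0 = -x @ x < 0` for any nonzero `x`, so the decrease is exact for the linearization; the conservatism mentioned in the abstract enters when the level set must also absorb the nonlinear perturbation terms.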