The paper presents the role of self-stabilization analysis in the design, verification, and validation of the dynamics of an Adaptive Flight Control System (AFCS). Since traditional self-stabilization approaches lack the flexibility to deal with the continuous adaptation of the neural network within the AFCS, the paper emphasizes an alternative self-stability analysis approach, namely Lyapunov's Second Method. A Lyapunov function for the neural network is constructed and used in a formal mathematical proof of the following claim: while learning from a fixed input manifold, the neural network is self-stabilizing in a Globally Asymptotically Stable manner. For variable data manifolds, we propose the need for a real-time stability monitor that can detect unstable state deviations. Test results based on data collected from an F-15 flight simulator provide substantial empirical evidence supporting the use of a Lyapunov function to prove the self-stabilization properties of the neural network adaptation.
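The real-time stability monitor proposed above can be illustrated with a minimal sketch. This is not the paper's implementation: the quadratic candidate function `lyapunov_v`, the `monitor` routine, and the `tolerance` threshold are all hypothetical choices made here for illustration. The idea follows Lyapunov's Second Method: a candidate function V that fails to decrease along the observed state trajectory signals a potential departure from stable behavior.

```python
def lyapunov_v(state):
    """Hypothetical Lyapunov candidate: squared norm of the state vector."""
    return sum(x * x for x in state)

def monitor(states, tolerance=1e-9):
    """Flag time steps where V increased between successive states.

    Returns the indices of potential unstable state deviations, i.e. steps
    at which the candidate Lyapunov function grew beyond the tolerance.
    """
    alerts = []
    prev_v = lyapunov_v(states[0])
    for i, state in enumerate(states[1:], start=1):
        v = lyapunov_v(state)
        if v > prev_v + tolerance:
            alerts.append(i)
        prev_v = v
    return alerts

# A contracting trajectory raises no alerts; a diverging one flags every step.
decaying = [[0.5 ** k, -0.5 ** k] for k in range(5)]
growing = [[2.0 ** k] for k in range(4)]
print(monitor(decaying))  # → []
print(monitor(growing))   # → [1, 2, 3]
```

In practice the candidate function would come from the stability proof for the specific network (here, the one constructed for the AFCS neural network), and the tolerance would account for measurement noise and transient adaptation.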