Practical methods of optimization (2nd ed.)
The nature of statistical learning theory
Fast training of support vector machines using sequential minimal optimization
Advances in kernel methods
Learning with Kernels: Support Vector Machines, Regularization, Optimization, and Beyond
A Tutorial on Support Vector Machines for Pattern Recognition
Data Mining and Knowledge Discovery
Improvements to Platt's SMO Algorithm for SVM Classifier Design
Neural Computation
Maximal Lyapunov functions and domains of attraction for autonomous nonlinear systems
Automatica (Journal of IFAC)
Computation of Lyapunov functions for smooth nonlinear systems using convex optimization
Automatica (Journal of IFAC)
A computational method for determining quadratic Lyapunov functions for non-linear systems
Automatica (Journal of IFAC)
This paper develops a computational approach for characterizing the stability regions of constrained nonlinear systems. A decision function is constructed that allows arbitrary initial states to be queried for inclusion within the stability region. The data essential to the construction process are generated by simulating the nonlinear system from multiple initial states. Using special procedures based on known properties of the stability region, the state data are randomly sampled so that they concentrate in desirable locations near the boundary of the stability region. Each selected state either belongs to the stability region or does not, yielding a two-class pattern recognition problem. Support vector machine learning, applied to this problem, determines the decision function. Special techniques are introduced that significantly improve the accuracy and efficiency of the learning process. Numerical examples illustrate the effectiveness of the overall approach.
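The workflow the abstract describes (simulate from many initial states, label each state by stability-region membership, then fit an SVM decision function) can be sketched as follows. This is a minimal illustration, not the paper's implementation: the scalar system dx/dt = -x + x^3 (whose stability region is the open interval (-1, 1)), the sampling ranges, the convergence test, and the RBF kernel parameters are all assumptions chosen for the example.

```python
# Hypothetical sketch: label sampled initial states of a nonlinear system
# by whether the simulated trajectory converges to the origin, then train
# an SVM whose decision function can be queried at arbitrary initial states.
import numpy as np
from sklearn.svm import SVC

def converges(x0, dt=0.01, steps=3000):
    """Forward-Euler simulation of the example system dx/dt = -x + x^3,
    whose stability region (domain of attraction of the origin) is (-1, 1).
    Returns True if the trajectory approaches the origin."""
    x = x0
    for _ in range(steps):
        x = x + dt * (-x + x**3)
        if abs(x) > 10.0:   # trajectory has left any bounded region: diverged
            return False
    return abs(x) < 1e-2

rng = np.random.default_rng(0)
X = rng.uniform(-2.0, 2.0, size=(200, 1))              # sampled initial states
y = np.array([1 if converges(x[0]) else 0 for x in X])  # two-class labels

clf = SVC(kernel="rbf", gamma=2.0).fit(X, y)  # SVM decision function
# Query arbitrary initial states for stability-region membership:
print(clf.predict([[0.5], [1.5]]))  # 0.5 lies inside (-1, 1); 1.5 does not
```

In the paper's setting the sampling is not uniform: states are deliberately concentrated near the stability boundary, which is where the classifier needs the most support vectors to resolve the boundary accurately.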