The nature of statistical learning theory
Nonlinear black-box modeling in system identification: a unified overview. Automatica (Journal of IFAC), special issue on trends in system identification
An introduction to Support Vector Machines and other kernel-based learning methods
Learning and Soft Computing: Support Vector Machines, Neural Networks, and Fuzzy Logic Models
Support Vectors Selection by Linear Programming. IJCNN '00: Proceedings of the IEEE-INNS-ENNS International Joint Conference on Neural Networks (IJCNN'00), Volume 5
A Support Vector Machine with a Hybrid Kernel and Minimal Vapnik-Chervonenkis Dimension. IEEE Transactions on Knowledge and Data Engineering
A tutorial on support vector regression. Statistics and Computing
An efficient star acquisition method based on SVM with mixtures of kernels. Pattern Recognition Letters
IITA'09: Proceedings of the 3rd International Conference on Intelligent Information Technology Application
Multikernel semiparametric linear programming support vector regression. Expert Systems with Applications: An International Journal
Measuring financial risk with generalized asymmetric least squares regression. Applied Soft Computing
Support-vector modeling of electromechanical coupling for microwave filter tuning. International Journal of RF and Microwave Computer-Aided Engineering
Computers and Industrial Engineering
As a sparse kernel modeling method, support vector regression (SVR) has come to be regarded as a state-of-the-art technique for regression and approximation. In [V.N. Vapnik, The Nature of Statistical Learning Theory, second ed., Springer-Verlag, 2000], Vapnik developed the ε-insensitive loss function for support vector regression as a trade-off between Huber's robust loss function and one that enables sparsity within the support vectors. The support vector kernel expansion provides a potential avenue for representing nonlinear dynamical systems and underpinning advanced analysis. However, the standard quadratic programming support vector regression (QP-SVR) is often computationally expensive to implement, and sufficient model sparsity cannot be guaranteed. In an attempt to mitigate these drawbacks, this article focuses on the application of soft-constrained linear programming support vector regression (LP-SVR) with a hybrid kernel to nonlinear black-box system identification. An innovative non-Mercer hybrid kernel is explored by leveraging the flexibility of LP-SVR in choosing kernel functions. The simulation results demonstrate the ability to use more general kernel functions and the inherent performance advantage of LP-SVR over QP-SVR in terms of model sparsity and computational efficiency.
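To make the LP-SVR idea concrete, the following is a minimal sketch (not the authors' exact formulation) of a soft-constrained LP-SVR: the regression function is the kernel expansion f(x) = sum_i beta_i k(x_i, x) + b, and the l1 norm of beta plus C times the ε-insensitive slacks is minimized as a linear program. The hybrid kernel is assumed here to be a convex mix of an RBF and a polynomial kernel (it need not satisfy Mercer's condition); all function names, parameter values, and the toy data are illustrative.

import numpy as np
from scipy.optimize import linprog

def hybrid_kernel(X1, X2, lam=0.7, gamma=1.0, degree=2):
    """Assumed hybrid kernel: convex mix of an RBF and a polynomial kernel."""
    sq = np.sum(X1**2, 1)[:, None] + np.sum(X2**2, 1)[None, :] - 2.0 * X1 @ X2.T
    rbf = np.exp(-gamma * sq)
    poly = (1.0 + X1 @ X2.T) ** degree
    return lam * rbf + (1.0 - lam) * poly

def lp_svr_fit(X, y, C=10.0, eps=0.05, **kern_kw):
    """Soft-constrained LP-SVR sketch: minimize ||beta||_1 + C*sum(xi)
    subject to |y_i - K_i beta - b| <= eps + xi_i, xi_i >= 0."""
    n = len(y)
    K = hybrid_kernel(X, X, **kern_kw)
    # decision variables: [beta+ (n), beta- (n), b+, b-, xi (n)], all >= 0
    c = np.hstack([np.ones(2 * n), [0.0, 0.0], C * np.ones(n)])
    I = np.eye(n)
    ones = np.ones((n, 1))
    # two-sided eps-insensitive constraints written as A_ub z <= b_ub
    A_ub = np.vstack([
        np.hstack([-K,  K, -ones,  ones, -I]),   # y - K beta - b <= eps + xi
        np.hstack([ K, -K,  ones, -ones, -I]),   # K beta + b - y <= eps + xi
    ])
    b_ub = np.hstack([eps - y, eps + y])
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=(0, None), method="highs")
    z = res.x
    beta = z[:n] - z[n:2 * n]
    b = z[2 * n] - z[2 * n + 1]
    return beta, b

def lp_svr_predict(X_train, beta, b, X_new, **kern_kw):
    return hybrid_kernel(X_new, X_train, **kern_kw) @ beta + b

# toy usage: identify a scalar nonlinear map from noisy samples
rng = np.random.default_rng(0)
X = rng.uniform(-3, 3, (60, 1))
y = np.sinc(X[:, 0]) + 0.05 * rng.standard_normal(60)
beta, b = lp_svr_fit(X, y)
print("support vectors:", int(np.sum(np.abs(beta) > 1e-6)), "of", len(y))

The l1 objective is what drives sparsity here: most beta_i are driven exactly to zero, so only a small subset of training points act as support vectors, and the problem remains a linear program even when the hybrid kernel matrix is not positive semidefinite.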