Multi-input multi-output (MIMO) regression estimation problems are common in engineering. Multi-dimensional support vector regression (M-SVR) is an efficient approach to MIMO modeling and is generally capable of better predictions than many traditional methods. However, M-SVR is sensitive to perturbations of its hyper-parameters on small-sample problems, and most model selection methods for conventional SVR cannot be applied to M-SVR directly because of its special structure. This paper proposes a fast and robust model selection algorithm for M-SVR. First, a new training algorithm for M-SVR is proposed that efficiently reduces numerical errors in the training procedure. Based on this algorithm, a new leave-one-out (LOO) error estimate for M-SVR is derived through a virtual LOO cross-validation procedure; this estimate can be computed directly once training has finished, at lower computational cost than the traditional LOO method. A robust implementation of this LOO estimate via Cholesky factorization is also proposed. Finally, the gradients of the LOO estimate are derived, and the hyper-parameters with the lowest LOO error are found by gradient descent. Experiments on both toy data and real-life dynamic load identification problems demonstrate the competitiveness of the proposed algorithm in terms of generalization performance, numerical stability, and computational cost.
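The pipeline the abstract describes (a closed-form LOO error evaluated after a single training run, factorized by Cholesky for stability, then minimized by gradient descent over the hyper-parameters) can be sketched generically. The sketch below is NOT the paper's M-SVR algorithm: it substitutes multi-output kernel ridge regression (closely related to LS-SVR), whose virtual-LOO (PRESS) error has the well-known closed form $e_i^{\text{loo}} = (y_i - \hat y_i)/(1 - H_{ii})$ with hat matrix $H = K(K + \lambda I)^{-1}$. All function names and the choice of an RBF kernel with hyper-parameters $(\gamma, \lambda)$ are illustrative assumptions.

```python
import numpy as np

def rbf_kernel(X, gamma):
    """RBF kernel matrix: K[i, j] = exp(-gamma * ||x_i - x_j||^2)."""
    sq = np.sum(X ** 2, axis=1)
    d2 = sq[:, None] + sq[None, :] - 2.0 * X @ X.T
    return np.exp(-gamma * np.maximum(d2, 0.0))

def loo_error(X, Y, log_theta):
    """Closed-form virtual-LOO (PRESS) error for multi-output kernel
    ridge regression (a stand-in for M-SVR, whose LOO estimate the
    paper derives):  e_i = (y_i - f(x_i)) / (1 - H_ii),
    H = K (K + lam*I)^{-1}, with the inverse obtained via Cholesky.
    """
    gamma, lam = np.exp(log_theta)        # optimize in log-space
    n = X.shape[0]
    K = rbf_kernel(X, gamma)
    L = np.linalg.cholesky(K + lam * np.eye(n))   # K + lam*I = L L^T
    Linv = np.linalg.inv(L)
    Ainv = Linv.T @ Linv                  # (K + lam*I)^{-1} = L^{-T} L^{-1}
    H = K @ Ainv                          # hat matrix
    resid = Y - H @ Y                     # training residuals (n x d outputs)
    e_loo = resid / (1.0 - np.diag(H))[:, None]
    return np.mean(e_loo ** 2)

def tune(X, Y, steps=60, lr=0.5, eps=1e-4):
    """Minimize the LOO error over (log gamma, log lam) by descent with
    central-difference gradients and backtracking step halving."""
    theta = np.zeros(2)                   # start at gamma = lam = 1
    err = loo_error(X, Y, theta)
    for _ in range(steps):
        grad = np.zeros(2)
        for k in range(2):
            dp = np.zeros(2); dp[k] = eps
            grad[k] = (loo_error(X, Y, theta + dp)
                       - loo_error(X, Y, theta - dp)) / (2.0 * eps)
        step = lr
        while step > 1e-6:                # backtrack so err never increases
            cand = theta - step * grad
            cand_err = loo_error(X, Y, cand)
            if cand_err < err:
                theta, err = cand, cand_err
                break
            step *= 0.5
    return np.exp(theta), err             # (gamma, lam), final LOO error
```

The key property the abstract emphasizes is preserved here: one factorization per training run yields the full LOO estimate, so no n retrainings are needed. The paper additionally derives analytic (rather than finite-difference) gradients of its LOO estimate for M-SVR.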