In the least squares support vector machine (LS-SVM), a key challenge is the selection of free parameters such as the kernel parameters and the tradeoff parameter. When a large number of free parameters are involved, the commonly used grid search method for model selection becomes intractable. In this paper, SLOO-MPS is proposed for tuning multiple parameters of the LS-SVM to overcome this problem. The method is based on optimizing a smooth leave-one-out error via a gradient descent algorithm and is computationally feasible. Extensive empirical comparisons confirm the feasibility and validity of SLOO-MPS.
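To illustrate the general idea, the following is a minimal Python sketch, not the paper's implementation: it tunes an RBF kernel width sigma and tradeoff parameter gamma for a bias-free LS-SVM by gradient descent on a sigmoid-smoothed leave-one-out error. The closed-form LOO residual for the regularized kernel system is standard; the RBF kernel, the sigmoid temperature of 0.1, the finite-difference gradients (the paper presumably derives analytic ones), and the names smooth_loo_error and tune are all assumptions for illustration.

```python
import numpy as np

def rbf_kernel(X, sigma):
    # Pairwise squared distances -> RBF Gram matrix
    sq = np.sum(X**2, axis=1)
    d2 = sq[:, None] + sq[None, :] - 2.0 * X @ X.T
    return np.exp(-d2 / (2.0 * sigma**2))

def smooth_loo_error(theta, X, y):
    # Smoothed leave-one-out misclassification rate for a bias-free
    # LS-SVM (an assumed simplification), theta = (log gamma, log sigma).
    gamma, sigma = np.exp(theta)
    K = rbf_kernel(X, sigma)
    A = K + np.eye(len(y)) / gamma           # regularized kernel system
    A_inv = np.linalg.inv(A)
    alpha = A_inv @ y
    # Closed-form LOO residuals: r_i = alpha_i / (A^{-1})_{ii}
    r = alpha / np.diag(A_inv)
    margins = y * (y - r)                    # y_i * f_{(-i)}(x_i)
    # Sigmoid smoothing of the 0-1 loss makes the objective differentiable
    return np.mean(1.0 / (1.0 + np.exp(margins / 0.1)))

def tune(X, y, theta0=(0.0, 0.0), lr=0.5, steps=100, eps=1e-4):
    # Gradient descent on the smoothed LOO error; gradients approximated
    # by central finite differences for brevity.
    theta = np.array(theta0, dtype=float)
    for _ in range(steps):
        grad = np.zeros_like(theta)
        for j in range(len(theta)):
            e = np.zeros_like(theta); e[j] = eps
            grad[j] = (smooth_loo_error(theta + e, X, y)
                       - smooth_loo_error(theta - e, X, y)) / (2 * eps)
        theta -= lr * grad
    return np.exp(theta)                     # tuned (gamma, sigma)
```

Optimizing in log-parameter space keeps gamma and sigma positive without constraints, and the smoothing is what makes gradient-based tuning possible where the raw leave-one-out misclassification count is piecewise constant.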