Support vector machines, reproducing kernel Hilbert spaces, and randomized GACV
Advances in kernel methods
The Entire Regularization Path for the Support Vector Machine
The Journal of Machine Learning Research
A tutorial on ν-support vector machines
Applied Stochastic Models in Business and Industry
Learning the Kernel Function via Regularization
The Journal of Machine Learning Research
A DC-programming algorithm for kernel selection
Proceedings of the 23rd International Conference on Machine Learning (ICML '06)
Two-dimensional solution path for support vector regression
Proceedings of the 23rd International Conference on Machine Learning (ICML '06)
IEEE Transactions on Neural Networks
Learning nonlinear hybrid systems: from sparse optimization to support vector regression
Proceedings of the 16th international conference on Hybrid systems: computation and control
This paper presents the full regularization paths for ν-SVM and ν-SVR, along with a leave-one-out-inspired stopping criterion and an efficient implementation. In the ν-SVR method, the user provides two parameters: the regularization parameter C and ν, which sets the width of the ν-tube. In the classical ν-SVM method, the parameter ν is a lower bound on the fraction of support vectors in the solution. Building on the previous work of [1,2], extensions of the regularization paths for SVM and SVR are proposed that automatically compute the solution path as ν or the regularization parameter varies.
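The role of ν described above can be illustrated with off-the-shelf tooling. The sketch below uses scikit-learn's `NuSVR` (an assumption for illustration; the paper computes the exact piecewise-linear path rather than refitting at sampled values) to sweep ν and observe its lower-bound effect on the fraction of support vectors:

```python
# Hedged sketch: sweeping nu in scikit-learn's NuSVR to observe that the
# fraction of support vectors is (approximately) lower-bounded by nu.
# This samples the path by refitting; the paper instead traces the exact path.
import numpy as np
from sklearn.svm import NuSVR

rng = np.random.RandomState(0)
X = np.sort(rng.uniform(0, 5, size=(40, 1)), axis=0)
y = np.sin(X).ravel() + 0.1 * rng.randn(40)

# nu lower-bounds the fraction of support vectors and upper-bounds the
# fraction of points outside the nu-tube.
for nu in (0.1, 0.3, 0.5, 0.8):
    model = NuSVR(nu=nu, C=1.0, kernel="rbf").fit(X, y)
    frac_sv = len(model.support_) / len(X)
    print(f"nu={nu:.1f}  support-vector fraction={frac_sv:.2f}")
```

A path-following algorithm exploits the fact that the solution is piecewise linear in the hyperparameter, so all intermediate solutions come at roughly the cost of a single fit.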