Learning with Kernels: Support Vector Machines, Regularization, Optimization, and Beyond
An upper bound on the leave-one-out (LOO) error for ν-support vector regression (ν-SVR) is presented. The bound is based on the geometrical concept of the span. Because computing the LOO error itself is extremely time consuming, the parameters of ν-SVR can be selected by minimizing this upper bound instead. The bound can also be used to estimate the generalization performance of ν-SVR. Experiments on two data sets show that the bound provides an informative and efficient approximation of the generalization behavior.
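To make the cost argument concrete, the following is a minimal sketch of the expensive baseline the span bound is meant to replace: selecting ν-SVR parameters by directly minimizing the LOO error. The data set, the candidate grid, and the use of scikit-learn's `NuSVR` are illustrative assumptions, not part of the paper; the paper's span-based bound would stand in for `loo_error` below.

```python
# Illustrative sketch (not the paper's method): choosing nu-SVR
# parameters by minimizing the exact LOO error, which requires
# refitting the model n times per candidate parameter setting.
import numpy as np
from sklearn.svm import NuSVR
from sklearn.model_selection import LeaveOneOut

rng = np.random.default_rng(0)
X = rng.uniform(-3, 3, size=(40, 1))          # toy regression data (assumed)
y = np.sin(X).ravel() + 0.1 * rng.standard_normal(40)

def loo_error(nu, C):
    """Exact LOO squared error: n model fits, hence very costly."""
    errs = []
    for train, test in LeaveOneOut().split(X):
        model = NuSVR(nu=nu, C=C).fit(X[train], y[train])
        errs.append((model.predict(X[test])[0] - y[test][0]) ** 2)
    return float(np.mean(errs))

# Grid search over candidate parameters; a cheap upper bound on the
# LOO error would replace loo_error here to avoid the n-fold refits.
grid = [(nu, C) for nu in (0.25, 0.5, 0.75) for C in (1.0, 10.0)]
best = min(grid, key=lambda p: loo_error(*p))
print("selected (nu, C):", best)
```

Each candidate setting costs n model fits, so the total work grows as n times the grid size; minimizing a closed-form upper bound instead reduces this to one fit per candidate.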