We propose a new stopping condition for a Support Vector Machine (SVM) solver that precisely reflects the objective of Leave-One-Out (LOO) error computation. The stopping condition guarantees that the output of an intermediate SVM solution is identical to the output of the optimal SVM solution trained with one data point excluded from the training set. A simple augmentation of a general SVM training algorithm allows one to use a stopping criterion equivalent to the proposed sufficient condition. A comprehensive experimental evaluation shows that our method consistently speeds up exact LOO computation, by a factor of up to 13 for the linear kernel. The new algorithm can be seen as an example of constructively guiding an optimization algorithm towards the best attainable expected risk at optimal computational cost.
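For orientation, the quantity being accelerated is the exact LOO error, which in its naive form retrains the SVM once per training point. Below is a minimal Python sketch of that baseline using scikit-learn's SVC as a stand-in solver (an assumption; the paper's solver, warm-starting, and proposed stopping condition are not implemented here, so the sketch only shows the structure of the computation the stopping condition speeds up).

import numpy as np
from sklearn.svm import SVC

def exact_loo_error(X, y, C=1.0, kernel="linear"):
    """Naive exact leave-one-out error: one full retraining per held-out point.

    The paper's stopping condition would instead terminate each inner solve
    as soon as the prediction on the held-out point is guaranteed to equal
    that of the fully optimized solution.
    """
    n = len(y)
    errors = 0
    for i in range(n):
        mask = np.arange(n) != i          # exclude point i from training
        clf = SVC(C=C, kernel=kernel)
        clf.fit(X[mask], y[mask])         # full solve; the expensive step
        if clf.predict(X[i:i + 1])[0] != y[i]:
            errors += 1
    return errors / n

# Usage on a toy linearly separable dataset:
rng = np.random.default_rng(0)
X = rng.normal(size=(40, 2))
y = np.where(X[:, 0] + X[:, 1] > 0, 1, -1)
print("LOO error:", exact_loo_error(X, y))

Since each of the n inner solves is run to full optimality, any criterion that stops a solve early while preserving the held-out prediction, as the proposed sufficient condition does, reduces the dominant cost of this loop directly.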