Support vector machines are trained by solving constrained quadratic optimization problems. This is usually done with an iterative decomposition algorithm that operates on a small working set of variables in each iteration. The training time depends strongly on the selection of these variables. We propose the maximum-gain working set selection algorithm for large-scale quadratic programming. It is based on the idea of greedily maximizing the progress made in each single iteration, and it takes second-order information from cached kernel matrix entries into account. We prove convergence to an optimal solution for a variant termed hybrid maximum-gain working set selection. The method is compared empirically with the prominent most-violating-pair selection and with the latest selection algorithm using second-order information. For large training sets, our new selection scheme is significantly faster.
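To make the setting concrete, the following is a minimal sketch (not the paper's own implementation) of second-order working-pair selection for the standard SVM dual, min ½αᵀQα − eᵀα subject to yᵀα = 0 and 0 ≤ α ≤ C. It picks the most violating index from the "up" set and then chooses its partner by a second-order gain estimate b²/a computed from cached kernel matrix entries; the function name, toy data, and tolerance are illustrative assumptions.

```python
import numpy as np

def select_working_pair(alpha, y, grad, Q, C, tau=1e-12):
    """Illustrative second-order working-pair selection for the SVM dual.

    alpha : current dual variables, grad : gradient Q @ alpha - 1,
    Q     : cached kernel matrix with labels folded in (Q_ij = y_i y_j k_ij).
    Returns a pair (i, j) or None if the KKT conditions hold.
    """
    n = len(alpha)
    # Index sets from the KKT conditions of the box-constrained dual.
    up = [(t, -y[t] * grad[t]) for t in range(n)
          if (y[t] > 0 and alpha[t] < C) or (y[t] < 0 and alpha[t] > 0)]
    low = [t for t in range(n)
           if (y[t] < 0 and alpha[t] < C) or (y[t] > 0 and alpha[t] > 0)]
    if not up or not low:
        return None
    # First index: maximal violation over the "up" set.
    i, m = max(up, key=lambda p: p[1])
    best_j, best_gain = None, -np.inf
    for t in low:
        b = m + y[t] * grad[t]              # violation of the pair (i, t)
        if b <= 0:
            continue                        # pair already optimal
        a = Q[i, i] + Q[t, t] - 2.0 * Q[i, t]
        a = max(a, tau)                     # guard non-positive curvature
        gain = b * b / a                    # second-order progress estimate
        if gain > best_gain:
            best_j, best_gain = t, gain
    return None if best_j is None else (i, best_j)

# Toy usage on a tiny linearly separable set (illustrative data only).
X = np.array([[1.0, 0.0], [0.9, 0.1], [-1.0, 0.0], [-0.8, -0.2]])
y = np.array([1.0, 1.0, -1.0, -1.0])
Q = (y[:, None] * y[None, :]) * (X @ X.T)   # linear kernel, labels folded in
alpha = np.zeros(4)
grad = Q @ alpha - np.ones(4)               # gradient at alpha = 0
pair = select_working_pair(alpha, y, grad, Q, C=1.0)
```

At α = 0 the "up" set contains only the positive examples and the "low" set only the negatives, so the selected pair always crosses the class boundary; a full decomposition solver would now optimize the two chosen variables analytically and update the gradient before the next selection.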