Fast training of support vector machines using sequential minimal optimization. Advances in Kernel Methods.
Convergence of a Generalized SMO Algorithm for SVM Classifier Design. Machine Learning.
Polynomial-Time Decomposition Algorithms for Support Vector Machines. Machine Learning.
SVMTorch: support vector machines for large-scale regression problems. The Journal of Machine Learning Research.
Training ν-Support Vector Classifiers: Theory and Algorithms. Neural Computation.
QP Algorithms with Guaranteed Accuracy and Run Time for Support Vector Machines. The Journal of Machine Learning Research.
General Polynomial Time Decomposition Algorithms. The Journal of Machine Learning Research.
The analysis of decomposition methods for support vector machines. IEEE Transactions on Neural Networks.
On the convergence of the decomposition method for support vector machines. IEEE Transactions on Neural Networks.
A study on SMO-type decomposition methods for support vector machines. IEEE Transactions on Neural Networks.
Sequential Minimal Optimization (SMO) [14] is a major tool for solving the convex quadratic optimization problems induced by Support Vector Machines (SVMs). It is based on the idea of iteratively solving subproblems of size two. In this work we give a characterization of the convex quadratic optimization problems that can also be solved with the SMO technique. In addition, we present an efficient 1/m-rate-certifying pair selection algorithm [8,13] that leads to polynomial-time convergence rates for such problems.
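To make the size-two-subproblem idea concrete, the following is a minimal sketch of an SMO-style solver for the standard SVM dual, min ½αᵀQα + pᵀα subject to yᵀα = 0 and 0 ≤ α ≤ C. It uses the maximal-violating-pair selection rule, which is one common choice related to rate-certifying pairs; the function name, formulation, and all details are illustrative assumptions, not taken from the paper itself.

```python
import numpy as np

def smo_solve(Q, p, y, C, tol=1e-6, max_iter=1000):
    """Sketch of SMO: minimize 0.5 a^T Q a + p^T a
    subject to y^T a = 0 and 0 <= a <= C, by repeatedly
    solving a closed-form subproblem over a pair (i, j)."""
    m = len(p)
    a = np.zeros(m)
    grad = p.copy()  # gradient of the objective at a = 0
    for _ in range(max_iter):
        # Maximal-violating-pair selection: pick i from the "up" set
        # (directions that can increase y_k * a_k) and j from the
        # "down" set, based on y_k * grad_k.
        up = [(y[k] * grad[k], k) for k in range(m)
              if (y[k] > 0 and a[k] < C) or (y[k] < 0 and a[k] > 0)]
        dn = [(y[k] * grad[k], k) for k in range(m)
              if (y[k] > 0 and a[k] > 0) or (y[k] < 0 and a[k] < C)]
        if not up or not dn:
            break
        gi, i = min(up)
        gj, j = max(dn)
        if gj - gi < tol:  # KKT violation below tolerance
            break
        # Size-two subproblem along d = y_i e_i - y_j e_j, which
        # preserves the equality constraint y^T a = 0.
        quad = max(Q[i, i] + Q[j, j] - 2.0 * y[i] * y[j] * Q[i, j], 1e-12)
        t = (gj - gi) / quad
        # Clip the step so both box constraints stay satisfied.
        t = min(t, C - a[i] if y[i] > 0 else a[i])
        t = min(t, a[j] if y[j] > 0 else C - a[j])
        a[i] += y[i] * t
        a[j] -= y[j] * t
        grad += t * (y[i] * Q[:, i] - y[j] * Q[:, j])
    return a
```

On a toy two-point problem (one point per class, linear kernel), a single pair update already reaches the optimum, since the pair covers all variables.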