A dual method for certain positive semidefinite quadratic programming problems
SIAM Journal on Scientific and Statistical Computing
A training algorithm for optimal margin classifiers
COLT '92: Proceedings of the Fifth Annual Workshop on Computational Learning Theory
Matrix computations (3rd ed.)
Solving semidefinite quadratic problems within nonsmooth optimization algorithms
Computers & Operations Research
Solving the quadratic programming problem arising in support vector classification
Advances in kernel methods
Making large-scale support vector machine learning practical
Advances in kernel methods
Fast training of support vector machines using sequential minimal optimization
Advances in kernel methods
An Introduction to Support Vector Machines and Other Kernel-Based Learning Methods
Provably Fast Training Algorithms for Support Vector Machines
ICDM '01: Proceedings of the 2001 IEEE International Conference on Data Mining
Efficient SVM training using low-rank kernel representations
The Journal of Machine Learning Research
The Entire Regularization Path for the Support Vector Machine
The Journal of Machine Learning Research
The Interplay of Optimization and Machine Learning Research
The Journal of Machine Learning Research
Considering Cost Asymmetry in Learning Classifiers
The Journal of Machine Learning Research
Classification model selection via bilevel programming
Optimization Methods & Software: Mathematical Programming in Data Mining and Machine Learning
An efficient active set method for SVM training without singular inner problems
IJCNN '09: Proceedings of the 2009 International Joint Conference on Neural Networks
A Least-squares Approach to Direct Importance Estimation
The Journal of Machine Learning Research
Computational Optimization and Applications
Multiple incremental decremental learning of support vector machines
IEEE Transactions on Neural Networks
Convergence improvement of active set training for support vector regressors
ICANN '10: Proceedings of the 20th International Conference on Artificial Neural Networks, Part II
Using an iterative linear solver in an interior-point method for generating support vector machines
Computational Optimization and Applications
A new algorithm for training SVMs using approximate minimal enclosing balls
CIARP '10: Proceedings of the 15th Iberoamerican Congress on Progress in Pattern Recognition, Image Analysis, Computer Vision, and Applications
Selective block minimization for faster convergence of limited memory large-scale linear models
Proceedings of the 17th ACM SIGKDD international conference on Knowledge discovery and data mining
Review: Supervised classification and mathematical optimization
Computers & Operations Research
We propose an active set algorithm for the convex quadratic programming (QP) problem at the core of support vector machine (SVM) training. The underlying method is not new: it builds on the extensive practice with the Simplex method and its variants for convex quadratic problems. Its application to large-scale SVM problems, however, is new; until recently, traditional active set methods were considered impractical for large SVM problems. By adapting these methods to the special structure of SVM problems, we obtain an efficient implementation. We conduct an extensive study of the behavior of our method and its variations on SVM problems, and present computational results comparing our method with Joachims' SVMlight (see Joachims, 1999). The results show that our method has better overall performance on many SVM problems, with a particularly strong advantage on the more difficult ones. In addition, the algorithm has better theoretical properties and extends naturally to incremental mode. Since the proposed method solves the standard SVM formulation, as does SVMlight, the generalization properties of the two approaches are identical and we do not discuss them in this paper.
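To make the QP concrete, the following is a minimal, self-contained sketch of the standard SVM dual problem that both the proposed active set method and SVMlight solve: maximize W(α) = Σα_i − ½ ΣΣ α_i α_j y_i y_j K(x_i, x_j) subject to 0 ≤ α_i ≤ C. This sketch is NOT the paper's algorithm; it uses plain projected coordinate ascent, and the bias is absorbed into the kernel (K(x, z) = x·z + 1) so the usual equality constraint Σα_i y_i = 0 can be dropped. All function names and the toy data are illustrative.

```python
# Sketch of the SVM dual QP (not the paper's active set method):
# projected coordinate ascent under the box constraint 0 <= alpha_i <= C,
# with the bias absorbed into the kernel so no equality constraint remains.

def kernel(x, z):
    # Linear kernel plus a constant term that plays the role of the bias.
    return sum(a * b for a, b in zip(x, z)) + 1.0

def train_svm_dual(X, y, C=10.0, lr=0.01, sweeps=2000):
    n = len(X)
    K = [[kernel(X[i], X[j]) for j in range(n)] for i in range(n)]
    alpha = [0.0] * n
    for _ in range(sweeps):
        for i in range(n):
            # Gradient of W(alpha) = sum(alpha) - 0.5 * alpha' Q alpha,
            # where Q_ij = y_i * y_j * K_ij.
            g = 1.0 - y[i] * sum(alpha[j] * y[j] * K[i][j] for j in range(n))
            # Ascent step, projected onto the box [0, C].
            alpha[i] = min(C, max(0.0, alpha[i] + lr * g))
    return alpha

def predict(alpha, X, y, x):
    s = sum(alpha[i] * y[i] * kernel(X[i], x) for i in range(len(X)))
    return 1 if s >= 0 else -1

# Toy linearly separable data.
X = [(0.0, 0.0), (0.0, 1.0), (2.0, 2.0), (3.0, 2.0)]
y = [-1, -1, 1, 1]
alpha = train_svm_dual(X, y)
```

The active set structure the paper exploits is visible even here: at the optimum the multipliers split into bound variables (α_i at 0 or C) and free variables, and an active set method works by solving equality-constrained subproblems on the current guess of that split rather than by gradient steps.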