Fast training of support vector machines using sequential minimal optimization. In: Advances in Kernel Methods.
Convergence of a Generalized SMO Algorithm for SVM Classifier Design. Machine Learning.
Working Set Selection Using Second Order Information for Training Support Vector Machines. The Journal of Machine Learning Research.
Support Vector Machines for Pattern Classification. In: ICANN '09: Proceedings of the 19th International Conference on Artificial Neural Networks, Part I.
Faster directions for second order SMO. In: ICANN '10: Proceedings of the 20th International Conference on Artificial Neural Networks, Part II.
LIBSVM: A library for support vector machines. ACM Transactions on Intelligent Systems and Technology (TIST).
We discuss a fast method for training support vector machines that combines Newton's method with fixed-size chunking. To speed up training, we limit the number of upper- or lower-bounded variables in the working set to two, so that the corrections to the variables do not violate the bounding conditions. If similar working sets occur alternately, we merge the two working sets into one, and if similar working sets occur consecutively, we compute the corrections using incremental Cholesky factorization. Computer experiments show that the proposed method is comparable to or faster than SMO (sequential minimal optimization) with second-order working set selection.
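The incremental Cholesky factorization mentioned in the abstract can be illustrated with a short sketch. The paper's exact update is not given here, so the following is only an assumed, minimal version of the standard technique: when a variable is added to the working set, the kernel submatrix K grows by one row and column, and its Cholesky factor L can be extended by one forward substitution and a square root instead of refactoring from scratch. The function name `cholesky_append` and the NumPy-based formulation are illustrative choices, not the authors' code.

```python
import numpy as np

def cholesky_append(L, k_new, kappa):
    """Extend the lower-triangular Cholesky factor L of K to the factor of
    the bordered matrix [[K, k_new], [k_new.T, kappa]] without refactoring.

    Cost is O(n^2) (one triangular solve) instead of O(n^3) for a full
    factorization of the enlarged matrix.
    """
    # Solve L l = k_new (L is lower triangular, so this is forward substitution).
    l = np.linalg.solve(L, k_new)
    # New diagonal entry; assumes the enlarged matrix is positive definite.
    d = np.sqrt(kappa - l @ l)
    n = L.shape[0]
    L_ext = np.zeros((n + 1, n + 1))
    L_ext[:n, :n] = L
    L_ext[n, :n] = l
    L_ext[n, n] = d
    return L_ext
```

When consecutive working sets share most of their variables, only the few appended rows need this update, which is where the speed-up over refactoring comes from.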