The All-Distances SVM (AD-SVM) is a single-objective light extension of the binary µ-SVM for multi-category classification that is competitive with multi-objective SVMs, such as One-against-the-Rest and One-against-One SVMs. Although the model involves considerably fewer constraints than previous formulations, it has lacked an efficient training algorithm, making it impractical for medium and large problems. In this paper, a Sequential Minimal Optimization (SMO)-like algorithm is proposed to train the All-Distances SVM, making large problems tractable. Experimental results on public benchmark data are presented, comparing the AD-SVM trained with this algorithm against other single-objective multi-category SVMs.
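To illustrate the family of algorithms the paper builds on, the sketch below implements a simplified SMO loop for a standard binary soft-margin SVM with a linear kernel (in the style of Platt's original method). It is not the AD-SVM training algorithm from the paper; the dataset, function name, and all parameters are illustrative assumptions. SMO's key idea is shown: at each step only a pair of dual variables is optimized analytically, with box-constraint clipping, so no general-purpose QP solver is needed.

```python
import numpy as np

def smo_train(X, y, C=1.0, tol=1e-4, max_passes=20, max_iters=10000):
    """Simplified SMO for a linear binary SVM (labels in {-1, +1}).
    Illustrative sketch only, not the paper's AD-SVM algorithm."""
    n = X.shape[0]
    alpha = np.zeros(n)
    b = 0.0
    K = X @ X.T  # linear kernel Gram matrix
    rng = np.random.default_rng(0)
    passes, iters = 0, 0
    while passes < max_passes and iters < max_iters:
        iters += 1
        changed = 0
        for i in range(n):
            # Prediction error on example i
            Ei = (alpha * y) @ K[:, i] + b - y[i]
            # Check whether alpha[i] violates the KKT conditions
            if (y[i] * Ei < -tol and alpha[i] < C) or (y[i] * Ei > tol and alpha[i] > 0):
                # Pick a second index j != i at random
                j = int(rng.integers(n - 1))
                if j >= i:
                    j += 1
                Ej = (alpha * y) @ K[:, j] + b - y[j]
                ai_old, aj_old = alpha[i], alpha[j]
                # Bounds keeping (alpha[i], alpha[j]) inside the box [0, C]^2
                if y[i] != y[j]:
                    L, H = max(0.0, aj_old - ai_old), min(C, C + aj_old - ai_old)
                else:
                    L, H = max(0.0, ai_old + aj_old - C), min(C, ai_old + aj_old)
                if L == H:
                    continue
                eta = 2 * K[i, j] - K[i, i] - K[j, j]  # second derivative (< 0)
                if eta >= 0:
                    continue
                # Analytic update of the pair, clipped to the feasible segment
                alpha[j] = np.clip(aj_old - y[j] * (Ei - Ej) / eta, L, H)
                if abs(alpha[j] - aj_old) < 1e-7:
                    continue
                alpha[i] = ai_old + y[i] * y[j] * (aj_old - alpha[j])
                # Update the bias term
                b1 = b - Ei - y[i] * (alpha[i] - ai_old) * K[i, i] \
                     - y[j] * (alpha[j] - aj_old) * K[i, j]
                b2 = b - Ej - y[i] * (alpha[i] - ai_old) * K[i, j] \
                     - y[j] * (alpha[j] - aj_old) * K[j, j]
                if 0 < alpha[i] < C:
                    b = b1
                elif 0 < alpha[j] < C:
                    b = b2
                else:
                    b = (b1 + b2) / 2
                changed += 1
        passes = passes + 1 if changed == 0 else 0
    w = (alpha * y) @ X  # recover the primal weight vector (linear kernel)
    return w, b

# Tiny linearly separable example (illustrative data)
X = np.array([[2.0, 2.0], [3.0, 3.0], [-2.0, -2.0], [-3.0, -3.0]])
y = np.array([1.0, 1.0, -1.0, -1.0])
w, b = smo_train(X, y)
preds = np.sign(X @ w + b)
```

The AD-SVM described in the abstract has a different (single-objective, multi-category) dual, so its SMO-like method selects and updates variables under that model's own constraints; the pairwise analytic-update-and-clip structure above is what the two approaches share.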