Machine learning, neural and statistical classification.
Pairwise classification and support vector machines. In: Advances in kernel methods.
A Tutorial on Support Vector Machines for Pattern Recognition. Data Mining and Knowledge Discovery.
Choosing Multiple Parameters for Support Vector Machines. Machine Learning.
Radius margin bounds for support vector machines with the RBF kernel. Neural Computation.
Probability Estimates for Multi-class Classification by Pairwise Coupling. The Journal of Machine Learning Research.
LIBSVM: A library for support vector machines. ACM Transactions on Intelligent Systems and Technology (TIST).
A comparison of methods for multiclass support vector machines. IEEE Transactions on Neural Networks.
Model selection plays a key role in the performance of support vector machines (SVMs). At present, nearly all research on this topic addresses binary classification and focuses on how to estimate the generalization performance of SVMs effectively and efficiently. For problems with more than two classes, where a classifier is typically constructed by combining several binary SVMs [8], most researchers simply select all binary SVM models simultaneously in a single hyper-parameter space. Although this all-in-one method works well, there is an alternative: the one-in-one method, in which each binary SVM model is selected independently. In this paper, we compare the two methods for multi-class SVMs built with the one-against-one strategy [8]. We discuss their properties and analyze their performance on experimental results.
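To make the contrast concrete, the following is a minimal sketch, not the authors' code, of the two selection strategies for one-against-one multi-class SVMs. It assumes scikit-learn, an RBF kernel, the iris dataset, and an illustrative hyper-parameter grid over (C, gamma); all of these choices are assumptions for demonstration only.

```python
# Sketch: all-in-one vs. one-in-one model selection for one-against-one
# multi-class SVMs (assumed setup: scikit-learn, RBF kernel, iris data,
# illustrative grid values).
from itertools import combinations

import numpy as np
from sklearn.datasets import load_iris
from sklearn.model_selection import GridSearchCV
from sklearn.svm import SVC

X, y = load_iris(return_X_y=True)
grid = {"C": [0.1, 1, 10, 100], "gamma": [0.01, 0.1, 1]}

# All-in-one: every pairwise binary SVM shares one (C, gamma) point,
# chosen jointly by cross-validated multi-class accuracy. SVC performs
# the one-against-one decomposition internally.
all_in_one = GridSearchCV(SVC(kernel="rbf"), grid, cv=5)
all_in_one.fit(X, y)
print("all-in-one:", all_in_one.best_params_)

# One-in-one: each pairwise binary SVM is tuned independently on the
# data of its own two classes, so different class pairs may end up
# with different hyper-parameters.
per_pair_params = {}
for i, j in combinations(np.unique(y), 2):
    mask = (y == i) | (y == j)
    search = GridSearchCV(SVC(kernel="rbf"), grid, cv=5)
    search.fit(X[mask], y[mask])
    per_pair_params[(i, j)] = search.best_params_
print("one-in-one:", per_pair_params)
```

Note that for k classes the one-in-one method tunes all k(k-1)/2 pairwise models separately, and at prediction time the independently tuned binary SVMs would still be combined by the usual one-against-one voting [8].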