One approach to multi-class classification consists in decomposing the original problem into a collection of binary classification tasks, whose outputs are then combined to produce a single prediction. Winner-takes-all, max-wins, and tree voting schemes are the most popular methods for this purpose. Among these, tree schemes can deliver faster predictions because they need to evaluate fewer binary models. Despite previous conclusions reported in the literature, this paper shows that their performance depends on the organization of the tree scheme, i.e., the positions at which each pairwise classifier is placed in the graph. Several metrics are studied for this purpose, and a new one is proposed that takes into account both the precision and the complexity of each pairwise model, which makes the method classifier-dependent. The study is performed using Support Vector Machines (SVMs) as base classifiers, but it could be extended to other kinds of binary classifiers. The proposed method, tested on benchmark data sets and on one real-world application, improves the accuracy of other decomposition-based multi-class classifiers while producing even faster predictions.
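To illustrate why tree schemes evaluate fewer binary models, the following is a minimal sketch of a DAG-style traversal over pairwise classifiers: each comparison eliminates one candidate class, so a K-class prediction needs only K-1 of the K(K-1)/2 trained pairwise models. The function and the toy threshold "classifiers" are hypothetical stand-ins, not the paper's actual SVM models or metric.

```python
from itertools import combinations

def ddag_predict(x, classes, pairwise):
    """Predict a class by eliminating one candidate per pairwise comparison.

    `pairwise[(a, b)]` is a callable returning the winning label between
    classes a and b for input x. Returns (label, number_of_evaluations);
    for K classes exactly K-1 pairwise models are evaluated, regardless
    of how many were trained.
    """
    candidates = list(classes)
    evaluations = 0
    while len(candidates) > 1:
        a, b = candidates[0], candidates[-1]
        winner = pairwise[(a, b)](x)
        evaluations += 1
        # Discard the losing class and continue down the DAG.
        if winner == a:
            candidates.pop()       # b eliminated
        else:
            candidates.pop(0)      # a eliminated
    return candidates[0], evaluations

# Toy pairwise models on a 1-D input: the larger-indexed class "wins"
# when x lies past the midpoint between the two class indices.
classes = [0, 1, 2, 3]
pairwise = {(a, b): (lambda x, a=a, b=b: b if x >= (a + b) / 2 else a)
            for a, b in combinations(classes, 2)}

label, n_eval = ddag_predict(2.6, classes, pairwise)
# 4 classes -> 6 pairwise models trained, but only 3 evaluated per prediction
```

The order in which pairs are compared along the path is exactly the "organization of the tree scheme" the abstract refers to: different placements of the pairwise classifiers yield different accuracy and prediction cost.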