A common way to construct a multiclass classifier is to combine the outputs of several binary classifiers according to an error-correcting output code (ECOC) scheme. The combination is typically done via a simple nearest-neighbor rule that finds the class that is closest in some sense to the outputs of the binary classifiers. For these nearest-neighbor ECOCs, we improve existing bounds on the error rate of the multiclass classifier given the average binary distance. The new bounds provide insight into the one-versus-rest and all-pairs matrices, which are compared through experiments with standard datasets. The results also show why elimination (also known as DAGSVM) and Hamming decoding often achieve the same accuracy.
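To make the nearest-neighbor decoding rule concrete, the following is a minimal Python/NumPy sketch (not code from the paper; the function and variable names are illustrative) that decodes the ±1 outputs of the binary classifiers against a one-versus-rest code matrix by Hamming distance.

```python
import numpy as np

def ecoc_decode(code_matrix, binary_outputs):
    """Nearest-neighbor (Hamming) ECOC decoding.

    code_matrix    : (n_classes, n_binary) array of +/-1 entries, one row (code word) per class.
    binary_outputs : (n_binary,) array of +/-1 predictions from the binary classifiers.

    Returns the index of the class whose code word is closest in Hamming
    distance to the vector of binary outputs.
    """
    # Hamming distance between each code word and the binary output vector.
    distances = np.sum(code_matrix != binary_outputs, axis=1)
    return int(np.argmin(distances))

# Example: one-versus-rest code matrix for 3 classes (+1 = "this class", -1 = "rest").
one_vs_rest = np.array([
    [ 1, -1, -1],
    [-1,  1, -1],
    [-1, -1,  1],
])

# The three binary classifiers output (+1, -1, -1); class 0 is the nearest code word.
print(ecoc_decode(one_vs_rest, np.array([1, -1, -1])))  # -> 0
```

An all-pairs matrix can be handled in the same spirit, with zero entries marking the classes that do not take part in a given pairwise problem; how such entries are counted in the distance is a decoding design choice not shown in this sketch.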