The one-against-all reduction from multiclass classification to binary classification is a standard technique for solving multiclass problems with binary classifiers. We show that modifying this technique to optimize its error-transformation properties yields a superior technique, both experimentally and theoretically. The resulting algorithm can also solve a more general problem, multi-label classification, which differs from multiclass classification in that it allows multiple correct labels for a given example.
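To make the standard (unmodified) one-against-all reduction concrete, here is a minimal sketch: one binary "is it class k?" problem is created per class, and the multiclass prediction is the class whose binary classifier scores highest. The perceptron base learner and the toy data are assumptions for illustration, not part of the paper's modified algorithm.

```python
def train_perceptron(xs, ys, epochs=50, lr=0.1):
    """Train a simple binary perceptron; labels ys are in {-1, +1}.
    Returns (weights, bias). Used here only as an illustrative base learner."""
    w = [0.0] * len(xs[0])
    b = 0.0
    for _ in range(epochs):
        for x, y in zip(xs, ys):
            score = sum(wi * xi for wi, xi in zip(w, x)) + b
            if y * score <= 0:  # misclassified (or on the boundary): update
                w = [wi + lr * y * xi for wi, xi in zip(w, x)]
                b += lr * y
    return w, b

def one_vs_all_train(xs, labels, classes):
    """The reduction: one binary problem per class, relabeling that class
    as +1 and all other classes as -1."""
    models = {}
    for k in classes:
        ys = [1 if lbl == k else -1 for lbl in labels]
        models[k] = train_perceptron(xs, ys)
    return models

def one_vs_all_predict(models, x):
    """Predict the class whose binary classifier gives the highest score."""
    def score(model):
        w, b = model
        return sum(wi * xi for wi, xi in zip(w, x)) + b
    return max(models, key=lambda k: score(models[k]))

# Toy data (an assumption for the example): three well-separated clusters.
xs = [[0.0, 0.0], [0.1, 0.1], [1.0, 0.0], [1.1, 0.1], [0.0, 1.0], [0.1, 1.1]]
labels = ["a", "a", "b", "b", "c", "c"]
models = one_vs_all_train(xs, labels, ["a", "b", "c"])
print(one_vs_all_predict(models, [0.05, 0.05]))  # prints "a"
```

Note that ties and miscalibrated scores between the independently trained binary classifiers are exactly where the plain reduction loses accuracy; the abstract's "error transformation properties" refer to how such binary errors translate into multiclass errors.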