A decision-theoretic generalization of on-line learning and an application to boosting
Journal of Computer and System Sciences - Special issue: 26th Annual ACM Symposium on the Theory of Computing (STOC '94), May 23–25, 1994, and Second Annual European Conference on Computational Learning Theory (EuroCOLT '95), March 13–15, 1995
Multiclass learning, boosting, and error-correcting codes
COLT '99: Proceedings of the Twelfth Annual Conference on Computational Learning Theory
Improved Boosting Algorithms Using Confidence-rated Predictions
Machine Learning - The Eleventh Annual Conference on Computational Learning Theory
Using output codes to boost multiclass learning problems
ICML '97: Proceedings of the Fourteenth International Conference on Machine Learning
Reducing multiclass to binary: a unifying approach for margin classifiers
The Journal of Machine Learning Research
Robust Real-Time Face Detection
International Journal of Computer Vision
In Defense of One-Vs-All Classification
The Journal of Machine Learning Research
Generic Object Recognition with Boosting
IEEE Transactions on Pattern Analysis and Machine Intelligence
Statistical Comparisons of Classifiers over Multiple Data Sets
The Journal of Machine Learning Research
Cost-sensitive boosting for classification of imbalanced data
Pattern Recognition
Learning an Alphabet of Shape and Appearance for Multi-Class Object Detection
International Journal of Computer Vision
IEEE Transactions on Knowledge and Data Engineering
Solving multiclass learning problems via error-correcting output codes
Journal of Artificial Intelligence Research
IEEE Transactions on Pattern Analysis and Machine Intelligence
Shedding light on the asymmetric learning capability of AdaBoost
Pattern Recognition Letters
RUSBoost: A Hybrid Approach to Alleviating Class Imbalance
IEEE Transactions on Systems, Man, and Cybernetics, Part A: Systems and Humans
Error-correcting output codes based ensemble feature extraction
Pattern Recognition
Detecting pedestrians on a Movement Feature Space
Pattern Recognition
We introduce a multi-class generalization of AdaBoost that uses binary weak-learners. Class labels are represented with a vector codification, and classifier responses are evaluated with a multi-class exponential loss function. This representation produces a set of margin values that provide a range of punishments for failures and rewards for successes. Moreover, the stage-wise optimization of this model yields an asymmetric boosting procedure whose costs depend on the number of classes separated by each weak-learner; in this way the boosting algorithm takes class imbalances into account when building the ensemble. The experiments performed compare this new approach favorably with AdaBoost.MH, GentleBoost, and SAMME.
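The vector codification and multi-class exponential loss described above can be sketched as follows. This is a minimal illustration, assuming a SAMME-style coding in which the true class gets +1 and every other coordinate gets -1/(K-1), and a loss of the form exp(-y·f/K); the function names `class_code` and `multiclass_exp_loss` are hypothetical, not from the paper.

```python
import numpy as np

def class_code(label, K):
    # Vector codification of a class label (assumed SAMME-style):
    # +1 in the true-class coordinate, -1/(K-1) elsewhere,
    # so the coordinates of each code vector sum to zero.
    y = np.full(K, -1.0 / (K - 1))
    y[label] = 1.0
    return y

def multiclass_exp_loss(y, f):
    # Multi-class exponential loss on the margin y·f/K (assumption).
    # Larger margins (responses aligned with the true class) give
    # smaller loss; misaligned responses give loss greater than 1.
    return np.exp(-np.dot(y, f) / len(y))

K = 3
y = class_code(0, K)
f_correct = class_code(0, K)  # response aligned with the true class
f_wrong = class_code(1, K)    # response aligned with a wrong class
assert multiclass_exp_loss(y, f_correct) < 1.0
assert multiclass_exp_loss(y, f_wrong) > 1.0
```

With this coding, a single correct or incorrect response already produces a graded margin value rather than a binary right/wrong outcome, which is what gives the range of punishments and rewards mentioned in the abstract.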