A decision-theoretic generalization of on-line learning and an application to boosting
Journal of Computer and System Sciences - Special issue: 26th Annual ACM Symposium on the Theory of Computing (STOC '94), May 23–25, 1994, and Second Annual European Conference on Computational Learning Theory (EuroCOLT '95), March 13–15, 1995
Improved Boosting Algorithms Using Confidence-rated Predictions
Machine Learning - The Eleventh Annual Conference on Computational Learning Theory
Machine Learning
How to Make AdaBoost.M1 Work for Weak Base Classifiers by Changing Only One Line of the Code
ECML '02 Proceedings of the 13th European Conference on Machine Learning
MadaBoost: A Modification of AdaBoost
COLT '00 Proceedings of the Thirteenth Annual Conference on Computational Learning Theory
On the algorithmic implementation of multiclass kernel-based vector machines
The Journal of Machine Learning Research
Information geometry of U-Boost and Bregman divergence
Neural Computation
Multiclass Boosting for Weak Classifiers
The Journal of Machine Learning Research
Robustifying AdaBoost by Adding the Naive Error Rate
Neural Computation
On the Consistency of Multiclass Classification Methods
The Journal of Machine Learning Research
We discuss robustness against mislabeling of multiclass labels in classification problems and propose two boosting algorithms, normalized Eta-Boost.M and Eta-Boost.M, based on the Eta-divergence. The two algorithms are closely related to models of mislabeling in which a label is erroneously exchanged for another. For both algorithms, we explore theoretical properties that support their robustness to mislabeling. We apply the two proposed methods to synthetic and real data sets to investigate their performance, focusing on robustness, and confirm their validity.
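The abstract gives no algorithmic detail of Eta-Boost.M, so as background only, the following is a minimal numpy sketch of the experimental setting it describes: labels of a synthetic data set are randomly exchanged (mislabeled), and a plain AdaBoost learner with decision stumps is trained on the noisy labels and evaluated against the clean ones. The exponential weight update in the loop is the known source of AdaBoost's sensitivity to label noise that the proposed robust methods target; all data parameters and function names here are illustrative assumptions, not the paper's method.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic 1-D two-class data: class +1 centered at +1, class -1 at -1.
n = 200
X = np.concatenate([rng.normal(1.0, 1.0, n), rng.normal(-1.0, 1.0, n)])
y = np.concatenate([np.ones(n), -np.ones(n)])

# Simulate the mislabeling model: flip 10% of the labels at random.
flip = rng.random(2 * n) < 0.10
y_noisy = np.where(flip, -y, y)

def stump_predict(X, thr, sign):
    """Threshold stump: predicts sign where X > thr, -sign otherwise."""
    return sign * np.where(X > thr, 1.0, -1.0)

def fit_adaboost(X, y, T=20):
    """Plain (non-robust) AdaBoost with exhaustive stump search."""
    w = np.full(len(X), 1.0 / len(X))
    stumps = []
    thresholds = np.unique(X)
    for _ in range(T):
        best = None
        for thr in thresholds:
            for sign in (1.0, -1.0):
                err = w[stump_predict(X, thr, sign) != y].sum()
                if best is None or err < best[0]:
                    best = (err, thr, sign)
        err, thr, sign = best
        err = np.clip(err, 1e-10, 1 - 1e-10)
        alpha = 0.5 * np.log((1 - err) / err)
        # Exponential weight update: consistently misclassified (e.g.
        # mislabeled) points get exponentially large weights, which is
        # why vanilla AdaBoost is fragile under label noise.
        w *= np.exp(-alpha * y * stump_predict(X, thr, sign))
        w /= w.sum()
        stumps.append((alpha, thr, sign))
    return stumps

def predict(stumps, X):
    agg = sum(a * stump_predict(X, t, s) for a, t, s in stumps)
    return np.sign(agg)

model = fit_adaboost(X, y_noisy, T=20)
acc = np.mean(predict(model, X) == y)  # accuracy against the *clean* labels
print(f"accuracy on clean labels: {acc:.3f}")
```

A robustified variant in the spirit of the paper would replace the unbounded exponential update with a bounded or normalized weighting derived from the Eta-divergence, so that mislabeled points cannot dominate the weight distribution.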