The Strength of Weak Learnability
Machine Learning
A decision-theoretic generalization of on-line learning and an application to boosting
Journal of Computer and System Sciences - Special issue: 26th annual ACM symposium on the theory of computing & STOC'94, May 23–25, 1994, and second annual European conference on computational learning theory (EuroCOLT'95), March 13–15, 1995
Machine Learning
MadaBoost: A Modification of AdaBoost
COLT '00 Proceedings of the Thirteenth Annual Conference on Computational Learning Theory
Information geometry of U-Boost and Bregman divergence
Neural Computation
Robust Loss Functions for Boosting
Neural Computation
Robust boosting algorithm against mislabeling in multiclass problems
Neural Computation
Avoiding Boosting Overfitting by Removing Confusing Samples
ECML '07 Proceedings of the 18th European conference on Machine Learning
Tutorial series on brain-inspired computing: part 6: geometrical structure of boosting algorithm
New Generation Computing
A multiclass classification method based on decoding of binary classifiers
Neural Computation
An estimation of generalized bradley-terry models based on the em algorithm
Neural Computation
Asymmetric constraint optimization based adaptive boosting for cascade face detector
ICIC'11 Proceedings of the 7th international conference on Advanced Intelligent Computing Theories and Applications: with aspects of artificial intelligence
AdaBoost can be derived by sequential minimization of the exponential loss function. It implements the learning process by exponentially reweighting examples according to their classification results. However, the weights are often too sharply tuned, so AdaBoost suffers from nonrobustness and overlearning. We propose a new boosting method that is a slight modification of AdaBoost. Its loss function is defined as a mixture of the exponential loss and the naive error loss. As a result, the proposed method incorporates an effect of forgetfulness into AdaBoost. The statistical significance of our method is discussed, and simulations are presented for confirmation.
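The exponential-reweighting scheme the abstract refers to can be sketched as follows. This is a minimal sketch of standard AdaBoost, not the paper's modified method (the mixture with the naive error loss is not implemented here); `weak_learn` is a hypothetical caller-supplied function that fits a weak classifier to weighted data.

```python
import numpy as np

def adaboost(X, y, weak_learn, n_rounds=50):
    """Standard AdaBoost via sequential minimization of the exponential loss.

    y must take values in {-1, +1}. weak_learn(X, y, w) is a hypothetical
    helper that returns a classifier h with h(X) in {-1, +1}.
    """
    n = len(y)
    w = np.full(n, 1.0 / n)              # start from uniform example weights
    hs, alphas = [], []
    for _ in range(n_rounds):
        h = weak_learn(X, y, w)
        pred = h(X)
        err = np.sum(w[pred != y])       # weighted training error of h
        err = np.clip(err, 1e-12, 1 - 1e-12)
        alpha = 0.5 * np.log((1 - err) / err)
        # exponential reweighting: misclassified examples gain weight,
        # correctly classified ones lose weight
        w *= np.exp(-alpha * y * pred)
        w /= w.sum()
        hs.append(h)
        alphas.append(alpha)
    # final classifier: sign of the weighted vote
    return lambda X: np.sign(sum(a * h(X) for a, h in zip(alphas, hs)))
```

Because the multiplicative update compounds across rounds, a few hard (or mislabeled) examples can come to dominate the weight vector; this is the sharp tuning the abstract identifies as the source of nonrobustness.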