Our purpose is to estimate conditional probabilities of output labels in multiclass classification problems. AdaBoost provides highly accurate classifiers and has the potential to estimate conditional probabilities as well. However, the conditional probabilities estimated by AdaBoost tend to overfit the training samples. We propose loss functions for boosting that yield shrinkage estimators: the regularizing effect is realized by shrinking the estimated probabilities toward the uniform distribution. Numerical experiments indicate that boosting algorithms based on the proposed loss functions give significantly better conditional probability estimates than existing boosting algorithms.
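For intuition, the sketch below illustrates the two ingredients the abstract refers to: the standard logistic link that turns an AdaBoost score F(x) into a conditional probability (the population minimizer of the exponential loss, per Friedman, Hastie and Tibshirani, 2000), and shrinkage toward the uniform distribution. The function shrink_toward_uniform and its mixing weight lam are illustrative assumptions used to show the effect of shrinkage as an explicit mixture; the paper itself builds the shrinkage into the loss functions rather than applying it post hoc.

```python
import numpy as np

def adaboost_probability(F):
    """Map a real-valued AdaBoost score F(x) to P(y=1 | x) via the
    logistic link 1 / (1 + exp(-2F)), the population minimizer of the
    exponential loss."""
    return 1.0 / (1.0 + np.exp(-2.0 * F))

def shrink_toward_uniform(p, lam=0.1):
    """Illustrative shrinkage: mix the estimated class probabilities
    with the uniform distribution over k classes. `lam` is a
    hypothetical mixing weight, not a quantity from the paper."""
    k = p.shape[-1]                       # number of classes
    return (1.0 - lam) * p + lam / k

# Demo on synthetic binary scores: large |F(x)| pushes the estimated
# probabilities toward 0 or 1 (the overfitting the abstract warns
# about); shrinkage pulls them back toward 1/2.
F = np.array([-3.0, -0.5, 0.0, 0.5, 3.0])
p1 = adaboost_probability(F)
probs = np.stack([1.0 - p1, p1], axis=1)  # columns: P(y=0|x), P(y=1|x)
print(np.round(probs, 3))
print(np.round(shrink_toward_uniform(probs, lam=0.2), 3))
```

Running the demo shows the raw estimates for |F| = 3 sitting near 0.002 and 0.998, while the shrunken versions are moderated toward the uniform value 0.5, which is the regularizing behavior the proposed loss functions are designed to produce.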