In simulation studies, boosting algorithms appear to be susceptible to noise. This article applies AdaBoost.M2 with decision stumps to the digit recognition example, a simulated data set with attribute noise. Although the final model is of adequate complexity, boosting fails to reach the Bayes error rate. A detailed analysis reveals characteristics of the boosting trials that contribute to the lack of fit.
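To make the setup concrete, the following is a minimal sketch of boosting with decision stumps as weak learners. It implements plain binary AdaBoost rather than the multiclass AdaBoost.M2 variant used in the article, and the toy "interval" data set below is an illustrative assumption, not the digit recognition example; it simply shows the reweighting mechanism by which a committee of stumps fits a concept that no single stump can.

```python
import math

def stump_predict(stump, x):
    """Predict +1/-1 with an axis-aligned decision stump (feature, threshold, polarity)."""
    j, t, pol = stump
    return pol if x[j] >= t else -pol

def best_stump(X, y, w):
    """Exhaustive search for the stump minimizing weighted training error."""
    best, best_err = None, float("inf")
    for j in range(len(X[0])):
        for t in sorted({x[j] for x in X}):
            for pol in (1, -1):
                err = sum(wi for xi, yi, wi in zip(X, y, w)
                          if stump_predict((j, t, pol), xi) != yi)
                if err < best_err:
                    best, best_err = (j, t, pol), err
    return best, best_err

def adaboost(X, y, rounds=10):
    """Binary AdaBoost with decision stumps as weak learners."""
    n = len(X)
    w = [1.0 / n] * n
    ensemble = []
    for _ in range(rounds):
        stump, err = best_stump(X, y, w)
        if err >= 0.5:          # weak learner no better than chance: stop
            break
        err = max(err, 1e-12)   # guard against log(0) for a perfect stump
        alpha = 0.5 * math.log((1 - err) / err)
        ensemble.append((alpha, stump))
        # Reweight: misclassified points gain weight, correct ones lose it.
        w = [wi * math.exp(-alpha * yi * stump_predict(stump, xi))
             for xi, yi, wi in zip(X, y, w)]
        s = sum(w)
        w = [wi / s for wi in w]
    return ensemble

def predict(ensemble, x):
    """Sign of the weighted vote of all stumps."""
    return 1 if sum(a * stump_predict(s, x) for a, s in ensemble) >= 0 else -1

# Hypothetical 1-D "interval" concept: no single stump separates it,
# but the boosted weighted vote of stumps does.
X = [(0,), (1,), (2,), (3,)]
y = [-1, 1, 1, -1]
model = adaboost(X, y, rounds=10)
train_acc = sum(predict(model, xi) == yi for xi, yi in zip(X, y)) / len(X)
```

Under attribute noise of the kind studied in the article, the same reweighting step is what causes trouble: points whose attributes were corrupted keep being misclassified, so their weights grow round after round and the ensemble concentrates on fitting noise rather than approaching the Bayes error.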