Boosting with Noisy Data: Some Views from Statistical Theory

  • Authors: Wenxin Jiang
  • Affiliations: Department of Statistics, Northwestern University, Evanston, IL 60208, U.S.A.
  • Venue: Neural Computation
  • Year: 2004

Abstract

This letter is a comprehensive account of some recent findings about AdaBoost in the presence of noisy data, approached from the perspective of statistical theory. We start from the basic assumption of weak hypotheses used in AdaBoost and study its validity and its implications for the generalization error. We recommend studying the generalization error and comparing it to the optimal Bayes error when data are noisy. Analytic examples are provided to show that running the unmodified AdaBoost forever will lead to overfitting. On the other hand, there exist regularized versions of AdaBoost that are consistent, in the sense that the resulting prediction approximately attains the optimal performance in the limit of large training samples.
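
The overfitting phenomenon the abstract describes can be reproduced numerically. The following is a minimal sketch, not the paper's analytic examples: it assumes NumPy and scikit-learn are available, and the data-generating process, sample sizes, noise level, and round counts are all illustrative choices rather than anything taken from the letter.

```python
# Illustrative sketch only: flip a fraction of labels so the Bayes error
# is known by construction, then watch AdaBoost's test error as the
# number of boosting rounds grows. All parameters here are assumptions
# made for illustration, not settings from the paper.
import numpy as np
from sklearn.ensemble import AdaBoostClassifier

rng = np.random.default_rng(0)
FLIP = 0.2  # label-flip probability; equals the Bayes error for this problem

def make_noisy_data(n, flip=FLIP):
    # One-dimensional threshold problem: the clean label is sign(x),
    # and each label is flipped independently with probability `flip`.
    X = rng.uniform(-1.0, 1.0, size=(n, 1))
    y = np.where(X[:, 0] > 0.0, 1, -1)
    flipped = rng.random(n) < flip
    y[flipped] = -y[flipped]
    return X, y

X_train, y_train = make_noisy_data(2000)
X_test, y_test = make_noisy_data(20000)

# scikit-learn's default base learner is a depth-1 decision stump, a
# standard "weak hypothesis". Running more rounds fits the flipped
# labels more closely, so the test error tends to drift above the
# Bayes error of 0.2 rather than converging to it.
for rounds in (10, 100, 1000):
    clf = AdaBoostClassifier(n_estimators=rounds).fit(X_train, y_train)
    err = np.mean(clf.predict(X_test) != y_test)
    print(f"rounds={rounds:4d}  test error={err:.3f}  (Bayes error={FLIP:.3f})")
```

In this sketch, stopping at a small number of rounds acts as a crude regularizer; the regularized, provably consistent variants of AdaBoost discussed in the letter can be read as principled versions of the same idea.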