Boosting in the presence of noise

  • Authors:
  • Adam Tauman Kalai; Rocco A. Servedio

  • Affiliations:
  • Toyota Technological Institute at Chicago, Chicago, IL 60637, USA; Department of Computer Science, Columbia University, 1214 Amsterdam Avenue, Mailcode 0401, New York, NY 10027, USA

  • Venue:
  • Journal of Computer and System Sciences - Special issue: Learning theory 2003
  • Year:
  • 2005

Abstract

Boosting algorithms are procedures that "boost" low-accuracy weak learning algorithms to achieve arbitrarily high accuracy. Over the past decade boosting has been widely used in practice and has become a major research topic in computational learning theory. In this paper we study boosting in the presence of random classification noise, giving both positive and negative results. We show that a modified version of a boosting algorithm due to Mansour and McAllester (J. Comput. System Sci. 64(1) (2002) 103) can achieve accuracy arbitrarily close to the noise rate. We also give a matching lower bound by showing that no efficient black-box boosting algorithm can boost accuracy beyond the noise rate (assuming that one-way functions exist). Finally, we consider a variant of the standard scenario for boosting in which the "weak learner" satisfies a slightly stronger condition than the usual weak learning guarantee. We give an efficient algorithm in this framework which can boost to arbitrarily high accuracy in the presence of classification noise.
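
To make the setting in the abstract concrete, the sketch below shows the generic boosting loop: a weak learner is invoked repeatedly on reweighted training data and its hypotheses are combined into a single stronger classifier, here run on synthetic data whose labels are flipped at a random classification noise rate. This is a plain AdaBoost-style illustration for intuition only, not the paper's noise-tolerant booster (a modified Mansour-McAllester branching-program algorithm); the data set, noise rate, and round count are illustrative assumptions.

```python
# Minimal sketch of the boosting setting described in the abstract, assuming a
# decision-stump weak learner and synthetic 1-D data with random classification
# noise. NOT the paper's algorithm; AdaBoost is shown purely to illustrate the
# "weak learner called on reweighted data" loop that boosting refers to.
import numpy as np

rng = np.random.default_rng(0)

# Synthetic data: true label is sign(x); each label is then flipped
# independently with probability eta (the random classification noise rate).
n, eta = 1000, 0.1
X = rng.uniform(-1, 1, size=n)
y = np.sign(X)
y[y == 0] = 1
flip = rng.random(n) < eta
y_noisy = np.where(flip, -y, y)

def best_stump(X, y, w):
    """Weak learner: threshold classifier sign(s*(x - t)) minimizing weighted error."""
    best = (np.inf, 0.0, 1)
    for t in np.unique(X):
        for s in (+1, -1):
            pred = s * np.sign(X - t)
            pred[pred == 0] = s
            err = np.sum(w[pred != y])
            if err < best[0]:
                best = (err, t, s)
    return best  # (weighted error, threshold, sign)

# Boosting loop: reweight examples, call the weak learner, keep its hypothesis.
T = 20
w = np.full(n, 1.0 / n)
stumps = []
for _ in range(T):
    err, t, s = best_stump(X, y_noisy, w)
    err = min(max(err, 1e-10), 1 - 1e-10)
    alpha = 0.5 * np.log((1 - err) / err)
    pred = s * np.sign(X - t)
    pred[pred == 0] = s
    w *= np.exp(-alpha * y_noisy * pred)   # up-weight misclassified examples
    w /= w.sum()
    stumps.append((alpha, t, s))

# Combined hypothesis: weighted vote over the weak hypotheses.
def H(x):
    score = sum(a * s * np.sign(x - t) for a, t, s in stumps)
    return np.where(score >= 0, 1, -1)

print("training error vs. noisy labels:", np.mean(H(X) != y_noisy))
print("error vs. true labels:          ", np.mean(H(X) != y))
```

The contrast the paper studies shows up in the two printed error rates: with noisy labels, a standard booster cannot hope to drive error below the noise rate when measured against the noisy examples, which motivates both the noise-rate barrier and the modified algorithms discussed in the abstract.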