Boosting Methods for Regression

  • Authors:
  • Nigel Duffy; David Helmbold

  • Affiliations:
  • Computer Science Department, University of California, Santa Cruz, Santa Cruz, CA 95064, USA. nigeduff@cse.ucsc.edu; dph@cse.ucsc.edu

  • Venue:
  • Machine Learning
  • Year:
  • 2002


Abstract

In this paper we examine ensemble methods for regression that leverage or “boost” base regressors by iteratively calling them on modified samples. The most successful leveraging algorithm for classification is AdaBoost, an algorithm that requires only modest assumptions on the base learning method for its strong theoretical guarantees. We present several gradient descent leveraging algorithms for regression and prove AdaBoost-style bounds on their sample errors using intuitive assumptions on the base learners. We bound the complexity of the regression functions produced in order to derive PAC-style bounds on their generalization errors. Experiments validate our theoretical results.
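The "gradient descent leveraging" view in the abstract can be made concrete: at each round, the base regressor is fit to the negative gradient of the loss evaluated at the current ensemble's predictions, and its output is added to the ensemble with a small step size. The sketch below is a generic squared-loss instance in Python, not the paper's specific algorithms; the depth-2 tree base learner, the step size, and the function names are illustrative assumptions.

import numpy as np
from sklearn.tree import DecisionTreeRegressor

def gradient_leveraging_fit(X, y, n_rounds=50, step=0.1):
    """Fit an ensemble by gradient descent on the squared loss.

    Each round, a base regressor (a depth-2 tree here, an assumption)
    is fit to the negative gradient, which for squared loss is simply
    the residual vector, and added to the ensemble.
    """
    ensemble = []
    F = np.zeros(len(y), dtype=float)  # current ensemble prediction
    for _ in range(n_rounds):
        residuals = y - F                           # negative gradient of squared loss
        h = DecisionTreeRegressor(max_depth=2).fit(X, residuals)
        ensemble.append(h)
        F += step * h.predict(X)                    # leveraging step
    return ensemble

def gradient_leveraging_predict(ensemble, X, step=0.1):
    # step must match the value used during fitting
    return step * sum(h.predict(X) for h in ensemble)

# Example usage on a noisy sine curve:
rng = np.random.default_rng(0)
X = rng.uniform(-3, 3, size=(200, 1))
y = np.sin(X).ravel() + 0.1 * rng.normal(size=200)
ens = gradient_leveraging_fit(X, y)
pred = gradient_leveraging_predict(ens, X)

Because the negative gradient of squared loss is just the residual vector, each round the base learner targets whatever the current ensemble still gets wrong; the small fixed step size keeps the growth of the ensemble's complexity gradual, in the spirit of the complexity bounds the abstract describes.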