Improving nonparametric regression methods by bagging and boosting

  • Authors:
  • Simone Borra; Agostino Di Ciaccio

  • Affiliations:
  • University of Rome "Tor Vergata", Italy; Department of Statistics, Probability, and Applied Statistics, University of Rome "La Sapienza", p.le Aldo Moro 5, I-00185 Roma, Italy

  • Venue:
  • Computational Statistics & Data Analysis - Nonlinear methods and data mining
  • Year:
  • 2002

Abstract

Recently, many authors have proposed new algorithms to improve the accuracy of certain classifiers by assembling a collection of individual classifiers obtained resampling on the training sample. Bagging and boosting are well-known methods in the machine learning context and they have been proved to be successful in classification problems. In the regression context, the application of these techniques has received little investigation. Our aim is to analyse, by simulation studies, when boosting and bagging can reduce the training set error and the generalization error, using nonparametric regression methods as predictors. In this work, we will consider three methods: projection pursuit regression (PPR), multivariate adaptive regression splines (MARS), local learning based on recursive covering (DART).