Stochastic boosting algorithms

  • Authors:
  • Ajay Jasra; Christopher C. Holmes

  • Affiliations:
  • Department of Mathematics, Imperial College London, London SW7 2AZ, UK; Department of Statistics, University of Oxford, Oxford OX1 3TG, UK

  • Venue:
  • Statistics and Computing
  • Year:
  • 2011

Abstract

In this article we develop a class of stochastic boosting (SB) algorithms, which build upon the work of Holmes and Pintore (Bayesian Stat. 8, Oxford University Press, Oxford, 2007). That work introduces boosting algorithms which correspond to standard boosting (e.g. Bühlmann and Hothorn, Stat. Sci. 22:477-505, 2007) except that the optimization algorithms are randomized; this idea is placed within a Bayesian framework. We show that the inferential procedure in Holmes and Pintore (Bayesian Stat. 8, Oxford University Press, Oxford, 2007) is incorrect and further develop interpretational, computational and theoretical results which allow one to assess SB's potential for classification and regression problems. To implement SB, sequential Monte Carlo (SMC) methods are applied. It is found that SB can provide better predictions for classification problems than the corresponding boosting algorithm. A theoretical result is also given, which shows that the predictions of SB are not significantly worse than those of boosting when the latter provides the best prediction. We also investigate the method on a real case study from machine learning.
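
The abstract describes SB as standard boosting with the greedy optimization step replaced by a randomized one. The sketch below is only one plausible reading of that idea for squared-error regression, not the authors' algorithm or their SMC implementation: at each round, candidate base learners are scored on the current residuals and one is sampled with probability proportional to exp(-loss/temperature) instead of being chosen by argmin. The names stochastic_boost, candidate_fits, temperature and nu are invented for the illustration.

    # Hypothetical sketch of a stochastic boosting loop (assumed form,
    # not the algorithm of Jasra & Holmes). Squared-error regression.
    import numpy as np

    def stochastic_boost(X, y, candidate_fits, n_rounds=100, nu=0.1,
                         temperature=1.0, rng=None):
        """X: (n, p) features; y: (n,) targets.
        candidate_fits: list of functions mapping (X, residuals) to a
        fitted base learner exposing .predict(X) (e.g. stump fitters)."""
        rng = np.random.default_rng(rng)
        F = np.zeros(len(y))          # current ensemble prediction
        ensemble = []                 # (base learner, step size) pairs
        for _ in range(n_rounds):
            r = y - F                 # residuals under squared-error loss
            learners = [fit(X, r) for fit in candidate_fits]
            losses = np.array([np.mean((r - h.predict(X)) ** 2)
                               for h in learners])
            # Randomized selection: softmax over negative losses
            # instead of the greedy argmin of standard boosting.
            w = np.exp(-(losses - losses.min()) / temperature)
            k = rng.choice(len(learners), p=w / w.sum())
            F += nu * learners[k].predict(X)
            ensemble.append((learners[k], nu))
        return ensemble

In this parameterization, temperature close to zero recovers near-greedy (standard) boosting, while larger values spread the selection over more candidate learners, which is the source of the randomness the abstract refers to.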