Combining bagging, boosting, rotation forest and random subspace methods

  • Author: Sotiris Kotsiantis
  • Affiliation: Department of Mathematics, University of Patras, Patras, Greece
  • Venue: Artificial Intelligence Review
  • Year: 2011

Abstract

Bagging, boosting, rotation forest and random subspace methods are well-known resampling ensemble methods that generate and combine a diverse set of learners using the same learning algorithm for the base classifiers. Boosting and rotation forest are considered stronger than bagging and the random subspace method on noise-free data, but there is strong empirical evidence that bagging and random subspace methods are much more robust than boosting and rotation forest in noisy settings. For this reason, in this work we build an ensemble of bagging, boosting, rotation forest and random subspace ensembles, each with six sub-classifiers, and combine their outputs by voting for the final prediction. On standard benchmark datasets, we compare this technique with plain bagging, boosting, rotation forest and random subspace ensembles of 25 sub-classifiers, as well as with other well-known combining methods, and the proposed technique achieves better accuracy in most cases.
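
The construction described in the abstract is concrete enough to sketch. Below is a minimal illustration in Python with scikit-learn (parameter names assume version 1.2 or later); it is not the authors' implementation. Rotation forest has no scikit-learn implementation, so a small random forest is substituted for that member here, and the random subspace method is expressed as feature-subset bagging without bootstrap sampling of the instances.

```python
# Minimal sketch of the combined-voting idea, assuming scikit-learn >= 1.2.
# Illustration only: the rotation forest member is replaced by a random
# forest, which scikit-learn does provide.
from sklearn.datasets import load_breast_cancer
from sklearn.ensemble import (AdaBoostClassifier, BaggingClassifier,
                              RandomForestClassifier, VotingClassifier)
from sklearn.model_selection import cross_val_score
from sklearn.tree import DecisionTreeClassifier

base = DecisionTreeClassifier(random_state=0)

members = [
    # Bagging: bootstrap samples of the training instances.
    ("bagging", BaggingClassifier(estimator=base, n_estimators=6,
                                  random_state=0)),
    # Boosting: sequentially re-weighted training instances.
    ("boosting", AdaBoostClassifier(estimator=base, n_estimators=6,
                                    random_state=0)),
    # Random subspace: each tree sees a random half of the features,
    # with no bootstrap sampling of the instances.
    ("subspace", BaggingClassifier(estimator=base, n_estimators=6,
                                   bootstrap=False, max_features=0.5,
                                   random_state=0)),
    # Stand-in for rotation forest (hypothetical substitution).
    ("rotation", RandomForestClassifier(n_estimators=6, random_state=0)),
]

# Majority vote over the four six-classifier ensembles, as in the abstract.
combined = VotingClassifier(estimators=members, voting="hard")

X, y = load_breast_cancer(return_X_y=True)
print(cross_val_score(combined, X, y, cv=10).mean())
```

Because VotingClassifier clones each member estimator before fitting, the same base decision tree can safely be reused across the bagging, boosting and subspace ensembles.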