RotBoost: A technique for combining Rotation Forest and AdaBoost

  • Authors:
  • Chun-Xia Zhang; Jiang-She Zhang

  • Affiliations:
  • School of Science, Xi'an Jiaotong University, Xi'an, Shaanxi 710049, People's Republic of China (both authors)

  • Venue:
  • Pattern Recognition Letters
  • Year:
  • 2008

Abstract

This paper presents RotBoost, a novel ensemble classifier generation technique constructed by combining Rotation Forest and AdaBoost. Experiments on 36 real-world data sets from the UCI repository, with a classification tree as the base learning algorithm, demonstrate that RotBoost produces ensemble classifiers with significantly lower prediction error than either Rotation Forest or AdaBoost far more often than the reverse. RotBoost is also found to perform much better than Bagging and MultiBoost. Bias and variance decompositions of the error give further insight into the considered classification methods: RotBoost simultaneously reduces both the bias and the variance of a single tree, and by a larger margin than the other ensemble methods, which explains why it performs best among the compared procedures. Furthermore, RotBoost has a potential advantage over AdaBoost in being well suited to parallel execution.
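The abstract only names the two building blocks, so the following is a minimal sketch of how they compose, assuming NumPy and a recent scikit-learn (the `estimator` keyword of `AdaBoostClassifier` requires scikit-learn 1.2 or later). The class name `RotBoostSketch`, its parameter names, and the simplified rotation step (a plain bootstrap of rows instead of the class-and-instance sampling used in the original Rotation Forest) are all illustrative assumptions, not the paper's exact procedure: S rotation matrices are built block by block from PCA on random feature subsets, and a T-round AdaBoost ensemble of trees is trained on each rotated copy of the data.

    import numpy as np
    from sklearn.decomposition import PCA
    from sklearn.ensemble import AdaBoostClassifier
    from sklearn.tree import DecisionTreeClassifier

    class RotBoostSketch:
        """Illustrative RotBoost: S rotations, each paired with a
        T-round AdaBoost ensemble of classification trees."""

        def __init__(self, n_rotations=10, n_boost_rounds=10,
                     n_feature_subsets=3, random_state=None):
            self.S = n_rotations          # rotations (outer loop)
            self.T = n_boost_rounds       # AdaBoost rounds per rotation
            self.K = n_feature_subsets    # feature subsets per rotation
            self.rng = np.random.default_rng(random_state)

        def _rotation_matrix(self, X):
            # Rotation-Forest-style block-diagonal rotation: split the
            # features into K random subsets, run PCA on a bootstrap
            # sample for each subset, and place the PCA loadings on the
            # corresponding diagonal block.
            n_features = X.shape[1]
            subsets = np.array_split(self.rng.permutation(n_features), self.K)
            R = np.zeros((n_features, n_features))
            for subset in subsets:
                # Plain bootstrap of rows; the original algorithm also
                # samples a subset of the classes first.
                rows = self.rng.choice(
                    len(X), size=max(3 * len(X) // 4, len(subset)))
                pca = PCA().fit(X[np.ix_(rows, subset)])
                R[np.ix_(subset, subset)] = pca.components_.T
            return R

        def fit(self, X, y):
            # Train S independent AdaBoost ensembles, one per rotation.
            self.rotations_, self.boosters_ = [], []
            for _ in range(self.S):
                R = self._rotation_matrix(X)
                booster = AdaBoostClassifier(
                    estimator=DecisionTreeClassifier(), n_estimators=self.T)
                booster.fit(X @ R, y)
                self.rotations_.append(R)
                self.boosters_.append(booster)
            return self

        def predict(self, X):
            # Majority vote over the S ensembles (class labels assumed
            # to be integers 0..C-1).
            votes = np.stack([b.predict(X @ R) for R, b
                              in zip(self.rotations_, self.boosters_)])
            return np.array([np.bincount(col.astype(int)).argmax()
                             for col in votes.T])

Because the S AdaBoost ensembles are trained independently on differently rotated copies of the data, the outer loop in fit can run in parallel, which illustrates the parallel-execution advantage over plain AdaBoost mentioned at the end of the abstract.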