Boosting recombined weak classifiers

  • Authors:
  • Juan J. Rodríguez; Jesús Maudes

  • Affiliations:
  • University of Burgos, Department of Civil Engineering, Escuela Politecnica Superior, c/Francisco de Vitoria s/n, 09006 Burgos, Spain (both authors)

  • Venue:
  • Pattern Recognition Letters
  • Year:
  • 2008

Abstract

Boosting is a family of methods for constructing classifier ensembles. Their distinguishing feature is that they obtain a strong classifier from a combination of weak classifiers, so boosting methods can be used with very simple base classifiers. Among the simplest classifiers are decision stumps: decision trees with a single decision node. This work proposes a variant of the best-known boosting method, AdaBoost. The variant considers as the base classifier at each round not only the last weak classifier, but a classifier formed by the last r selected weak classifiers, where r is a parameter of the method. If the weak classifiers are decision stumps, the combination of r weak classifiers is a decision tree. The ensembles obtained with the variant are formed by the same number of decision stumps as with the original AdaBoost. Hence, the original version and the variant produce classifiers of very similar size and computational complexity (for both training and classification). The experimental study shows that the variant is clearly beneficial.
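
The following is a minimal sketch of the idea as described in the abstract, not the authors' exact algorithm. It runs a discrete AdaBoost loop in which each round trains one new stump, but the classifier that receives a weight and drives the reweighting is a combination of the last r stumps. The combination rule (unweighted majority vote) and the function name boost_recombined are assumptions for illustration; the paper may combine and weight the stumps differently.

```python
import numpy as np
from sklearn.tree import DecisionTreeClassifier

def boost_recombined(X, y, n_rounds=50, r=3):
    """AdaBoost-style loop whose per-round base classifier is a vote
    over the last r stumps.  Sketch only: the voting rule and the use
    of the standard AdaBoost weight update on the composite are
    assumptions, not the authors' published algorithm.
    Labels y must be in {-1, +1}."""
    y = np.asarray(y)
    n = len(y)
    w = np.full(n, 1.0 / n)            # example weights
    stumps, composites, alphas = [], [], []

    for t in range(n_rounds):
        # Train one new decision stump on the weighted sample.
        stump = DecisionTreeClassifier(max_depth=1)
        stump.fit(X, y, sample_weight=w)
        stumps.append(stump)

        # Base classifier for this round: majority vote of the last r
        # stumps (representable as a decision tree of depth <= r over
        # the stumps' tests).
        window = stumps[-r:]
        def composite(X, window=window):
            votes = sum(s.predict(X) for s in window)
            return np.where(votes >= 0, 1, -1)
        pred = composite(X)

        # Standard AdaBoost weighting, applied to the composite.
        err = np.sum(w * (pred != y)) / np.sum(w)
        err = np.clip(err, 1e-10, 1 - 1e-10)
        alpha = 0.5 * np.log((1 - err) / err)
        composites.append(composite)
        alphas.append(alpha)

        # Reweight the examples and renormalize.
        w *= np.exp(-alpha * y * pred)
        w /= w.sum()

    def predict(X):
        score = sum(a * c(X) for a, c in zip(alphas, composites))
        return np.where(score >= 0, 1, -1)
    return predict
```

Note that each round still trains exactly one new stump, matching the abstract's claim that the variant uses the same number of stumps as standard AdaBoost; with r = 1 this sketch reduces to ordinary discrete AdaBoost on decision stumps.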