Boosting with averaged weight vectors

  • Authors: Nikunj C. Oza
  • Affiliations: Computational Sciences Division, NASA Ames Research Center, Moffett Field, CA
  • Venue: MCS'03 Proceedings of the 4th International Conference on Multiple Classifier Systems
  • Year: 2003

Abstract

AdaBoost [5] is a well-known ensemble learning algorithm that constructs its constituent, or base, models in sequence. A key step in AdaBoost is constructing a distribution over the training examples from which each base model is created. This distribution, represented as a vector, is constructed to be orthogonal to the vector of mistakes made by the previous base model in the sequence [7], the idea being to make the next base model's errors uncorrelated with those of the previous model. Researchers have observed that it would likely be better to construct a distribution orthogonal to the mistake vectors of all previous base models, but that this is not always possible [7]. We present an algorithm that attempts to come as close as possible to this goal in an efficient manner, and we report experimental results demonstrating significant improvements over AdaBoost and over the Totally Corrective boosting algorithm [7], which also attempts to satisfy this goal.
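Since only the abstract is available here, the sketch below is a speculative illustration rather than the paper's published algorithm. It shows an AdaBoost-style loop in which, as one plausible reading of the title, each new training distribution is the average of all weight vectors produced so far, so that it reflects the mistakes of every previous base model rather than only the most recent one. The names `boost_averaged_weights` and `predict_ensemble`, the decision-stump base learner, and the uniform averaging rule are all assumptions.

```python
import numpy as np
from sklearn.tree import DecisionTreeClassifier

def boost_averaged_weights(X, y, n_rounds=10):
    """Toy boosting loop using averaged weight vectors.

    Labels y must be in {-1, +1}. The averaging step is a hypothetical
    reading of the paper's title, not its published update rule.
    """
    n = len(y)
    d = np.full(n, 1.0 / n)        # current training distribution
    weight_history = [d.copy()]    # every distribution seen so far
    models, alphas = [], []
    for _ in range(n_rounds):
        stump = DecisionTreeClassifier(max_depth=1)
        stump.fit(X, y, sample_weight=d)
        pred = stump.predict(X)
        # Weighted training error and the usual AdaBoost model weight.
        err = np.clip(np.sum(d * (pred != y)), 1e-10, 1 - 1e-10)
        alpha = 0.5 * np.log((1 - err) / err)
        models.append(stump)
        alphas.append(alpha)
        # Standard AdaBoost reweighting against the latest mistake vector.
        d_new = d * np.exp(-alpha * y * pred)
        d_new /= d_new.sum()
        weight_history.append(d_new)
        # Hypothetical averaging step: the next distribution mixes all
        # previous weight vectors, so it responds to the mistakes of every
        # earlier base model, not just the most recent one.
        d = np.mean(weight_history, axis=0)
        d /= d.sum()
    return models, alphas

def predict_ensemble(models, alphas, X):
    """Weighted-vote prediction, as in standard AdaBoost."""
    scores = sum(a * m.predict(X) for m, a in zip(models, alphas))
    return np.sign(scores)
```

One appeal of an averaging scheme of this kind is that it keeps older mistake information in play at constant extra cost per round, which is consistent with the abstract's emphasis on efficiency; the paper's actual mechanism may differ.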