How to Make AdaBoost.M1 Work for Weak Base Classifiers by Changing Only One Line of the Code

  • Authors:
  • Günther Eibl;Karl Peter Pfeiffer

  • Venue:
  • ECML '02 Proceedings of the 13th European Conference on Machine Learning
  • Year:
  • 2002

Abstract

AdaBoost.M1 is a well-known and widely applied boosting algorithm for multiclass classification problems with a multiclass base classifier. However, AdaBoost.M1 does not work if the base classifier is too weak. We show that, by modifying only one line of AdaBoost.M1, it can be made usable for weak base classifiers as well. The resulting classifier, AdaBoost.M1W, is guaranteed to minimize an upper bound on a performance measure called the guessing error, as long as the base classifier is better than random guessing. The usability of AdaBoost.M1W is clearly demonstrated experimentally.
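The abstract does not spell out which line changes, so the following is a minimal sketch under one assumption: that the modified line is the classifier weight, using α_t = ln((K−1)(1−ε_t)/ε_t) in place of AdaBoost.M1's α_t = ln((1−ε_t)/ε_t). With this change the stopping condition relaxes from "weighted error below 1/2" to "weighted error below 1 − 1/K", i.e. better than random guessing among K classes. The decision-stump weak learner and the toy data below are illustrative choices, not from the paper.

```python
import numpy as np

def fit_stump(X, y, w, classes):
    """Weighted decision stump: pick one feature and one threshold,
    predict the weighted-majority class on each side of the threshold."""
    best, best_err = None, np.inf
    for j in range(X.shape[1]):
        vals = np.unique(X[:, j])
        for t in (vals[:-1] + vals[1:]) / 2:
            left = X[:, j] <= t
            pred = np.empty_like(y)
            for side in (left, ~left):
                if side.any():
                    scores = [w[side & (y == c)].sum() for c in classes]
                    pred[side] = classes[int(np.argmax(scores))]
            err = w[pred != y].sum()
            if err < best_err:
                best_err = err
                cl = pred[left][0] if left.any() else classes[0]
                cr = pred[~left][0] if (~left).any() else classes[0]
                best = (j, t, cl, cr, pred.copy())
    j, t, cl, cr, pred = best
    return (j, t, cl, cr), best_err, pred

def adaboost_m1w(X, y, rounds=40):
    classes = np.unique(y)
    K, n = len(classes), len(y)
    w = np.full(n, 1.0 / n)
    ensemble = []
    for _ in range(rounds):
        stump, eps, pred = fit_stump(X, y, w, classes)
        # Stop once the weak learner is no better than random guessing.
        # (Plain AdaBoost.M1 would instead require eps < 1/2 here.)
        if eps >= 1.0 - 1.0 / K:
            break
        eps = max(eps, 1e-10)
        # The assumed one-line change: an extra factor (K - 1) in alpha.
        alpha = np.log((K - 1) * (1 - eps) / eps)
        ensemble.append((alpha, stump))
        w *= np.exp(alpha * (pred != y))
        w /= w.sum()
    return classes, ensemble

def predict(classes, ensemble, X):
    """Weighted vote of the stumps; return the argmax class per row."""
    votes = np.zeros((len(X), len(classes)))
    for alpha, (j, t, cl, cr) in ensemble:
        pred = np.where(X[:, j] <= t, cl, cr)
        for i, c in enumerate(classes):
            votes[:, i] += alpha * (pred == c)
    return classes[np.argmax(votes, axis=1)]

# Toy 3-class problem: three Gaussian blobs in the plane.
rng = np.random.default_rng(0)
centers = np.array([[0.0, 0.0], [3.0, 0.0], [0.0, 3.0]])
X = np.vstack([rng.normal(c, 0.7, size=(60, 2)) for c in centers])
y = np.repeat(np.arange(3), 60)

classes, ens = adaboost_m1w(X, y, rounds=40)
acc = (predict(classes, ens, X) == y).mean()
```

A single stump can only separate one blob from the other two (training accuracy around 2/3, i.e. a weighted error well above 1/2 once the weights shift), so this is exactly the weak-learner regime the modification targets; the boosted ensemble recovers a much higher training accuracy.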