Improvement of Boosting Algorithm by Modifying the Weighting Rule

  • Authors:
  • Masayuki Nakamura; Hiroki Nomiya; Kuniaki Uehara

  • Affiliations:
  • Masayuki Nakamura: Graduate School of Science and Technology, Kobe University, Nada, Kobe 657-8501, Japan. E-mail: m-yuki@hcc1.bai.ne.jp
  • Hiroki Nomiya: Graduate School of Science and Technology, Kobe University, Nada, Kobe 657-8501, Japan. E-mail: nomiya@ai.cs.scitec.kobe-u.ac.jp
  • Kuniaki Uehara: Graduate School of Science and Technology, Kobe University, Nada, Kobe 657-8501, Japan. E-mail: uehara@kobe-u.ac.jp

  • Venue:
  • Annals of Mathematics and Artificial Intelligence
  • Year:
  • 2004

Abstract

AdaBoost is a method for improving the classification accuracy of a given learning algorithm by combining hypotheses created by that learning algorithm. One of the drawbacks of AdaBoost is that its performance degrades when the training examples include noisy or exceptional examples, which are called hard examples. This degradation occurs because AdaBoost assigns excessively high weights to hard examples. In this research, we introduce thresholds into the weighting rule of AdaBoost in order to prevent weights from growing too large. During the learning process, we compare the upper bound of the classification error of our method with that of AdaBoost, and we set the thresholds so that the upper bound of our method is superior to that of AdaBoost. Our method shows better performance than AdaBoost.
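
To make the idea concrete, here is a minimal Python sketch of AdaBoost in which each example's weight is clipped at a threshold before the next round. The fixed cap parameter, the function name, and the use of decision stumps as base learners are all illustrative assumptions on our part; the paper itself derives its thresholds during learning from a comparison of error upper bounds, which is not reproduced here.

```python
import numpy as np
from sklearn.tree import DecisionTreeClassifier

def adaboost_with_weight_cap(X, y, n_rounds=50, cap=None):
    """AdaBoost with a hypothetical per-example weight threshold.

    `cap`, if given, is a fixed upper bound applied to every weight
    after each update. Labels y must be in {-1, +1}.
    """
    y = np.asarray(y)
    n = len(y)
    w = np.full(n, 1.0 / n)          # uniform initial distribution
    learners, alphas = [], []
    for _ in range(n_rounds):
        stump = DecisionTreeClassifier(max_depth=1)
        stump.fit(X, y, sample_weight=w)
        pred = stump.predict(X)
        err = np.sum(w[pred != y])   # weighted training error
        if err <= 0 or err >= 0.5:   # stop if the stump is perfect or too weak
            break
        alpha = 0.5 * np.log((1 - err) / err)
        # Standard AdaBoost weight update...
        w *= np.exp(-alpha * y * pred)
        w /= w.sum()
        # ...followed by the thresholding step: clip any weight that
        # exceeds the cap, so hard examples cannot dominate the sample.
        if cap is not None:
            w = np.minimum(w, cap)
            w /= w.sum()             # renormalize to a distribution
        learners.append(stump)
        alphas.append(alpha)

    def predict(X_new):
        agg = sum(a * h.predict(X_new) for a, h in zip(alphas, learners))
        return np.sign(agg)

    return predict
```

Renormalizing after clipping keeps the weights a proper distribution, so the base learner still receives a valid weighted sample; without the cap, the code reduces to ordinary AdaBoost.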