An Improved AdaBoost Algorithm Based on Adaptive Weight Adjusting

  • Authors:
  • Lili Cheng; Jianpei Zhang; Jing Yang; Jun Ma

  • Affiliations:
  • College of Computer Science and Technology, Harbin Engineering University, Harbin, 150001, China (all authors)

  • Venue:
  • ADMA '07: Proceedings of the 3rd International Conference on Advanced Data Mining and Applications
  • Year:
  • 2007

Abstract

A base classifier trained by the AdaBoost ensemble learning algorithm is assigned a constant weight that is applied to all test instances. From the perspective of AdaBoost's iterative process, however, each base classifier performs well only in a certain small region of the input space, so using the same weight for different test samples is unreasonable. An improved AdaBoost algorithm based on adaptive weight adjusting is presented. The selection of classifiers and their weights are determined by the full information behavior correlation, which describes the correlation between a test sample and a base classifier. The method makes use of all scalars of a base classifier's full information behavior and thus overcomes the problem of information loss. The results of simulated experiments show that the ensemble classification performance is greatly improved.
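
For readers unfamiliar with the baseline the paper modifies, the sketch below contrasts standard AdaBoost's constant classifier weights with a per-instance adaptive weighting at prediction time. This is only an illustrative sketch: the `behavior_correlation` function is a hypothetical stand-in (a nearest-neighbour accuracy estimate), not the paper's full information behavior correlation, whose definition is not given in this abstract.

```python
import numpy as np
from sklearn.tree import DecisionTreeClassifier

def train_adaboost(X, y, n_rounds=10):
    """Standard AdaBoost.M1 training; labels y must be in {-1, +1}."""
    n = len(X)
    w = np.full(n, 1.0 / n)                     # instance weights
    learners, alphas, masks = [], [], []
    for _ in range(n_rounds):
        stump = DecisionTreeClassifier(max_depth=1)
        stump.fit(X, y, sample_weight=w)
        pred = stump.predict(X)
        err = np.clip(np.sum(w * (pred != y)) / np.sum(w), 1e-10, 1 - 1e-10)
        alpha = 0.5 * np.log((1 - err) / err)   # constant classifier weight
        w *= np.exp(-alpha * y * pred)          # re-weight training instances
        w /= w.sum()
        learners.append(stump)
        alphas.append(alpha)
        masks.append(pred == y)                 # where this learner was correct
    return learners, np.array(alphas), masks

def behavior_correlation(x, X_train, mask, k=5):
    """Hypothetical stand-in for the paper's correlation measure: the
    fraction of the k nearest training points on which the base
    classifier was correct, i.e. how reliable it is near x."""
    d = np.linalg.norm(X_train - x, axis=1)
    nearest = np.argsort(d)[:k]
    return mask[nearest].mean()

def predict_adaptive(x, X_train, learners, alphas, masks):
    """Per-instance weighting: each constant alpha is rescaled by how well
    the classifier behaves in the neighbourhood of the test sample x."""
    score = 0.0
    for clf, alpha, mask in zip(learners, alphas, masks):
        rho = behavior_correlation(x, X_train, mask)
        score += rho * alpha * clf.predict(x.reshape(1, -1))[0]
    return np.sign(score)
```

Replacing `rho` with 1.0 in `predict_adaptive` recovers the standard AdaBoost decision rule, which makes the contrast with the constant-weight baseline explicit.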