Parameter inference of cost-sensitive boosting algorithms

  • Authors:
  • Yanmin Sun; A. K. C. Wong; Yang Wang

  • Affiliations:
  • Pattern Analysis and Machine Intelligence Lab, University of Waterloo; Pattern Analysis and Machine Intelligence Lab, University of Waterloo; Pattern Discovery Software Ltd.

  • Venue:
  • MLDM '05: Proceedings of the 4th International Conference on Machine Learning and Data Mining in Pattern Recognition
  • Year:
  • 2005


Abstract

Several cost-sensitive boosting algorithms have been reported as effective methods for dealing with the class imbalance problem. In these algorithms, misclassification costs, which reflect the differing importance of identifying each class, are integrated into the weight-update formula of the AdaBoost algorithm. Yet the weight-update parameter of AdaBoost is derived precisely so that the training error decreases most rapidly; this derivation is the crucial step by which AdaBoost converts a weak learning algorithm into a strong one, and most reported cost-sensitive boosting algorithms ignore it. In this paper, we propose three versions of cost-sensitive AdaBoost in which the parameters for sample-weight updating are properly induced. Their ability to identify the small class is then evaluated, using the F-measure, on four real-world medical data sets taken from the UCI Machine Learning Repository. Our experimental results show that one of the proposed cost-sensitive AdaBoost algorithms achieves the best identification ability on the small class among all reported cost-sensitive boosting algorithms.
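To make the weight-update idea concrete, the sketch below shows one plausible form of cost-sensitive boosting in which per-sample misclassification costs enter the AdaBoost weight update multiplicatively and the update parameter alpha is derived from the cost-weighted error (an AdaC2-style form). It is a minimal illustration under stated assumptions, not the paper's exact algorithms; the function names and the `costs` array are introduced here for illustration only.

```python
import numpy as np
from sklearn.tree import DecisionTreeClassifier

def cost_sensitive_adaboost(X, y, costs, n_rounds=50):
    """Illustrative cost-sensitive AdaBoost loop (AdaC2-style sketch).

    Assumptions: labels y are in {-1, +1}; `costs` is a hypothetical
    NumPy array of per-sample misclassification costs, set larger for
    the small (more important) class.
    """
    n = len(y)
    w = np.full(n, 1.0 / n)            # initial sample-weight distribution
    learners, alphas = [], []
    for _ in range(n_rounds):
        stump = DecisionTreeClassifier(max_depth=1)
        stump.fit(X, y, sample_weight=w)
        pred = stump.predict(X)
        correct = pred == y
        # alpha is chosen so the cost-weighted training-error bound
        # shrinks most rapidly, mirroring AdaBoost's own derivation.
        eps = 1e-12                    # guard against degenerate rounds
        num = np.sum(costs[correct] * w[correct]) + eps
        den = np.sum(costs[~correct] * w[~correct]) + eps
        alpha = 0.5 * np.log(num / den)
        # Costs multiply the weights outside the exponent (AdaC2 form),
        # so high-cost samples retain relatively more weight each round.
        w = costs * w * np.exp(-alpha * y * pred)
        w /= w.sum()                   # renormalize to a distribution
        learners.append(stump)
        alphas.append(alpha)
    return learners, alphas

def boosted_predict(learners, alphas, X):
    """Sign of the alpha-weighted vote of the learned stumps."""
    agg = sum(a * h.predict(X) for h, a in zip(learners, alphas))
    return np.sign(agg)
```

In this sketch, raising a sample's cost both increases its influence on alpha and slows the decay of its weight, which is how such schemes bias the ensemble toward the small class.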