An Empirical Study of MetaCost Using Boosting Algorithms

  • Authors:
  • Kai Ming Ting

  • Venue:
  • ECML '00 Proceedings of the 11th European Conference on Machine Learning
  • Year:
  • 2000

Abstract

MetaCost is a recently proposed procedure that converts an error-based learning algorithm into a cost-sensitive one. This paper investigates two important issues concerning the procedure that were left unaddressed in the paper proposing MetaCost. First, no comparison was made between MetaCost's final model and the internal cost-sensitive classifier on which MetaCost depends. It is plausible that the internal cost-sensitive classifier outperforms the final model, without the additional computation required to derive that model. Second, MetaCost assumes that its internal cost-sensitive classifier is obtained by applying a minimum expected cost criterion; it is unclear whether violating this assumption affects MetaCost's performance. We study these issues using two boosting procedures, and compare the results with the performance of the original form of MetaCost, which employs bagging.
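The minimum expected cost criterion mentioned in the abstract can be sketched as follows. This is a minimal illustration, not code from the paper: given class-probability estimates for each example (e.g. averaged over bagged base models, as in the original MetaCost) and a misclassification cost matrix, each example is relabeled with the class that minimizes expected cost. The function name and array conventions are assumptions for illustration.

```python
import numpy as np

def min_expected_cost_labels(probs, cost):
    """Relabel examples by the minimum expected cost criterion (illustrative sketch).

    probs: (n_samples, n_classes) estimated class probabilities P(j|x),
           e.g. averaged over an ensemble of base models.
    cost:  (n_classes, n_classes) matrix where cost[j, i] is the cost of
           predicting class i when the actual class is j.
    Returns the relabeled class indices, argmin_i sum_j P(j|x) * cost[j, i].
    """
    # expected[n, i] = sum_j probs[n, j] * cost[j, i]
    expected = probs @ cost
    return expected.argmin(axis=1)

# Example: P(class 0) = 0.8, but misclassifying class 1 as 0 costs 5x more,
# so the minimum expected cost label is class 1 despite class 0 being more probable.
probs = np.array([[0.8, 0.2]])
cost = np.array([[0.0, 1.0],
                 [5.0, 0.0]])
print(min_expected_cost_labels(probs, cost))  # → [1]
```

MetaCost then trains its final model on the relabeled data with the original error-based learner, which is why comparing that final model against the internal cost-sensitive classifier itself is a natural question.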