Cost-Sensitive Boosting: Fitting an Additive Asymmetric Logistic Regression Model

  • Authors:
  • Qiu-Jie Li; Yao-Bin Mao; Zhi-Quan Wang; Wen-Bo Xiang

  • Affiliations:
  • School of Automation, Nanjing University of Sci. & Tech., Nanjing, P.R. China 210094 (all authors)

  • Venue:
  • ACML '09 Proceedings of the 1st Asian Conference on Machine Learning: Advances in Machine Learning
  • Year:
  • 2009


Abstract

Conventional machine learning algorithms such as boosting treat all misclassification errors equally, which is inadequate for cost-sensitive classification problems such as object detection. Although many cost-sensitive extensions of boosting that directly modify the weighting strategy of the corresponding original algorithms have been proposed, they are heuristic in nature, validated only by empirical results, and lack sound theoretical analysis. This paper develops, from a statistical perspective, a framework that embodies almost all existing cost-sensitive boosting algorithms: fitting an additive asymmetric logistic regression model by stage-wise optimization of suitable criteria. Four cost-sensitive boosting algorithms are derived, namely CSDA, CSRA, CSGA and CSLB, corresponding respectively to Discrete AdaBoost, Real AdaBoost, Gentle AdaBoost and LogitBoost. Experimental results on face detection demonstrate the effectiveness of the proposed learning framework in reducing the cumulative misclassification cost.
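To make the idea concrete, the following is a minimal sketch of a cost-sensitive variant of Discrete AdaBoost in the spirit of CSDA, not the paper's exact algorithm: per-example costs enter the exponential weight update, so expensive mistakes are re-weighted more aggressively, which stage-wise fits an asymmetric additive model. The stump search, the `costs` vector, and all function names here are illustrative assumptions.

```python
import numpy as np

def train_cs_adaboost(X, y, costs, n_rounds=20):
    """Illustrative cost-sensitive Discrete AdaBoost (CSDA-like sketch).

    X: (n, d) features; y: labels in {-1, +1}; costs: per-example
    misclassification costs (assumption: positives may carry a higher
    cost, as in object detection where missing a face is costly).
    Weak learners are axis-aligned decision stumps.
    """
    n, d = X.shape
    w = np.full(n, 1.0 / n)          # uniform initial weights
    ensemble = []                     # list of (alpha, feature, threshold, polarity)
    for _ in range(n_rounds):
        best, best_err = None, np.inf
        # Exhaustive stump search: pick the weighted-error minimizer.
        for j in range(d):
            for thr in np.unique(X[:, j]):
                for pol in (1, -1):
                    pred = np.where(pol * (X[:, j] - thr) >= 0, 1, -1)
                    err = np.sum(w[pred != y])
                    if err < best_err:
                        best_err, best = err, (j, thr, pol)
        eps = np.clip(best_err, 1e-10, 1 - 1e-10)
        alpha = 0.5 * np.log((1 - eps) / eps)
        j, thr, pol = best
        pred = np.where(pol * (X[:, j] - thr) >= 0, 1, -1)
        # Cost-modulated exponential update: the cost multiplies the
        # margin term, so high-cost examples gain weight faster when
        # misclassified (the asymmetric re-weighting).
        w *= np.exp(-alpha * costs * y * pred)
        w /= w.sum()
        ensemble.append((alpha, j, thr, pol))
    return ensemble

def predict(ensemble, X):
    """Sign of the weighted vote of all stumps."""
    score = np.zeros(len(X))
    for alpha, j, thr, pol in ensemble:
        score += alpha * np.where(pol * (X[:, j] - thr) >= 0, 1, -1)
    return np.where(score >= 0, 1, -1)
```

Setting all costs to 1 recovers the ordinary Discrete AdaBoost update; raising the cost of one class biases the ensemble toward fewer errors on that class, at the price of more errors on the other.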