Robust Loss Functions for Boosting

  • Authors:
  • Takafumi Kanamori, Takashi Takenouchi, Shinto Eguchi, Noboru Murata

  • Affiliations:
  • Takafumi Kanamori: Department of Mathematical and Computing Sciences, Tokyo Institute of Technology, Tokyo 152-8552, Japan (kanamori@is.titech.ac.jp)
  • Takashi Takenouchi: Nara Institute of Science and Technology, Ikoma, Nara 630-0192, Japan (ttakashi@is.naist.jp)
  • Shinto Eguchi: Institute of Statistical Mathematics, and Department of Statistical Science, Graduate University of Advanced Studies, Minato-ku, Tokyo 106-8569, Japan (eguchi@ism.ac.jp)
  • Noboru Murata: School of Science and Engineering, Waseda University, Shinjuku, Tokyo 169-8555, Japan (noboru.murata@eb.waseda.ac.jp)

  • Venue:
  • Neural Computation
  • Year:
  • 2007


Abstract

Boosting can be viewed as a gradient descent algorithm over loss functions. It is often pointed out that AdaBoost, the prototypical boosting algorithm, is highly sensitive to outliers. In this letter, loss functions for robust boosting are studied. Based on concepts from robust statistics, we propose a transformation of loss functions that makes boosting algorithms robust against extreme outliers. We then apply truncation of loss functions to contamination models that describe the occurrence of mislabels near decision boundaries. Numerical experiments illustrate that the proposed loss functions derived from the contamination models handle highly noisy data better than other loss functions.
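The abstract contrasts AdaBoost's unbounded exponential loss with truncated losses that bound the penalty assigned to badly mislabeled points. The sketch below is illustrative only: it does not reproduce the paper's specific transformation, and the `cap` parameter and `truncated_exp_loss` function are hypothetical names chosen for this example. It merely shows why capping the loss limits the influence of extreme outliers on the margin.

```python
import numpy as np

def exp_loss(margin):
    """AdaBoost's exponential loss: grows without bound as the margin
    becomes very negative, so a single outlier can dominate training."""
    return np.exp(-margin)

def truncated_exp_loss(margin, cap=5.0):
    """Hypothetical truncated variant (not the paper's exact proposal):
    identical to exp_loss for well-classified points, but bounded above
    by `cap`, so an extreme outlier contributes at most a constant."""
    return np.minimum(np.exp(-margin), cap)

# Margins y*f(x): negative values are misclassified points.
margins = np.array([-4.0, -1.0, 0.0, 1.0, 3.0])
print(exp_loss(margins))        # the outlier at margin -4 dominates
print(truncated_exp_loss(margins))  # its contribution is capped at 5.0
```

Because the truncated loss is flat beyond the cap, the gradient with respect to extreme outliers vanishes, which is the mechanism by which such losses gain robustness.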