Improved Boosting Algorithms Using Confidence-rated Predictions

  • Authors:
  • Robert E. Schapire; Yoram Singer

  • Affiliations:
  • Robert E. Schapire: AT&T Labs, Shannon Laboratory, 180 Park Avenue, Room A279, Florham Park, NJ 07932-0971, USA. schapire@research.att.com
  • Yoram Singer: AT&T Labs, Shannon Laboratory, 180 Park Avenue, Room A277, Florham Park, NJ 07932-0971, USA. singer@research.att.com

  • Venue:
  • Machine Learning - The Eleventh Annual Conference on Computational Learning Theory
  • Year:
  • 1999

Abstract

We describe several improvements to Freund and Schapire's AdaBoost boosting algorithm, particularly in a setting in which hypotheses may assign confidences to each of their predictions. We give a simplified analysis of AdaBoost in this setting, and we show how this analysis can be used to find improved parameter settings as well as a refined criterion for training weak hypotheses. We give a specific method for assigning confidences to the predictions of decision trees, a method closely related to one used by Quinlan. This method also suggests a technique for growing decision trees which turns out to be identical to one proposed by Kearns and Mansour.

We focus next on how to apply the new boosting algorithms to multiclass classification problems, particularly to the multi-label case in which each example may belong to more than one class. We give two boosting methods for this problem, plus a third method based on output coding. One of these leads to a new method for handling the single-label case which is simpler but as effective as techniques suggested by Freund and Schapire. Finally, we give some experimental results comparing a few of the algorithms discussed in this paper.
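To make the confidence-rated setting concrete, here is a minimal sketch of boosting with real-valued weak hypotheses in the style the abstract describes: each weak hypothesis outputs a signed confidence, example weights are updated multiplicatively by exp(-y·h(x)), and the final classifier is the sign of the summed confidences. The stump learner, the smoothing constant, and all names below are illustrative assumptions, not the paper's exact procedure.

```python
import math

def train_stump(X, y, D):
    """Pick the threshold stump minimizing Z = sum_i D(i)*exp(-y_i*h(x_i)),
    where each side of the split gets a real-valued confidence
    c = 1/2 ln(W+ / W-) computed from the weighted class totals (smoothed
    to keep confidences finite). All details here are illustrative."""
    n = len(X)
    eps = 1.0 / (2 * n)  # smoothing term (an assumption, not from the paper)
    best = None
    for j in range(len(X[0])):
        for thr in sorted({x[j] for x in X}):
            # W[side][label]: total weight of each class on each side
            W = {s: {+1: 0.0, -1: 0.0} for s in (0, 1)}
            for xi, yi, di in zip(X, y, D):
                W[int(xi[j] >= thr)][yi] += di
            c = {s: 0.5 * math.log((W[s][+1] + eps) / (W[s][-1] + eps))
                 for s in (0, 1)}
            Z = sum(di * math.exp(-yi * c[int(xi[j] >= thr)])
                    for xi, yi, di in zip(X, y, D))
            if best is None or Z < best[0]:
                best = (Z, j, thr, c)
    _, j, thr, c = best
    return lambda x: c[int(x[j] >= thr)]

def adaboost(X, y, rounds=10):
    """Confidence-rated boosting loop: reweight by exp(-y*h(x)), normalize,
    and output the sign of the summed weak-hypothesis confidences."""
    n = len(X)
    D = [1.0 / n] * n
    hs = []
    for _ in range(rounds):
        h = train_stump(X, y, D)
        hs.append(h)
        D = [d * math.exp(-yi * h(xi)) for d, xi, yi in zip(D, X, y)]
        Z = sum(D)
        D = [d / Z for d in D]
    return lambda x: 1 if sum(h(x) for h in hs) >= 0 else -1

# Toy usage: two separable 1-D clusters
X = [[0.0], [1.0], [2.0], [3.0], [4.0], [5.0]]
y = [-1, -1, -1, 1, 1, 1]
H = adaboost(X, y)
```

Because the confidence is folded into the hypothesis output itself, there is no separate alpha coefficient per round; choosing the split to minimize Z plays the role of the refined training criterion the abstract mentions.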