Boosting lite: handling larger datasets and slower base classifiers

  • Authors:
  • Lawrence O. Hall, Robert E. Banfield, Kevin W. Bowyer, W. Philip Kegelmeyer

  • Affiliations:
  • Department of Computer Science & Engineering, University of South Florida, Tampa, Florida; Computer Science & Engineering, University of Notre Dame, Notre Dame, IN; Computational Sciences and Math Research Department, Sandia National Labs

  • Venue:
  • MCS'07 Proceedings of the 7th international conference on Multiple classifier systems
  • Year:
  • 2007

Abstract

In this paper, we examine ensemble algorithms (Boosting Lite and Ivoting) that approximate the accuracy of a single classifier while requiring significantly fewer training examples. Such algorithms allow ensemble methods to operate on very large data sets or with very slow learning algorithms. Boosting Lite is compared with Ivoting, standard boosting, and building a single classifier, on 11 data sets to which other approaches have been applied. We find that ensembles of support vector machines can attain higher accuracy with less data than ensembles of decision trees. We also find that Ivoting may produce higher-accuracy ensembles on some data sets; however, Boosting Lite is generally able to indicate when boosting will increase overall accuracy.
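The paper's text is not reproduced here, but the sampling idea behind Ivoting (Breiman's importance-sampled "small votes", which Boosting Lite also builds on) can be sketched roughly as follows. This is a hypothetical minimal illustration, not the authors' code: the 1-D decision-stump base learner, the bite size, and the clipping of the error estimate are assumptions made for the sketch, and the error is computed on the full training set rather than out-of-bag as in Breiman's formulation.

```python
import random

# Hypothetical sketch of Ivoting-style importance sampling ("small votes"):
# each round trains a base learner on a small "bite" of examples, drawn so
# that points the current ensemble misclassifies are always accepted and
# correctly classified points are accepted with probability err / (1 - err).
# The stump learner and all parameters are illustrative assumptions.

def train_stump(X, y):
    """Return the best threshold stump for 1-D inputs with labels in {-1, +1}."""
    best = None
    for t in set(X):
        for sign in (1, -1):
            acc = sum((sign if x >= t else -sign) == yi for x, yi in zip(X, y))
            if best is None or acc > best[0]:
                best = (acc, t, sign)
    _, t, sign = best
    return lambda x, t=t, sign=sign: sign if x >= t else -sign

def ivote(X, y, n_rounds=10, bite=20, seed=0):
    """Build a small-vote ensemble from importance-sampled bites of the data."""
    rng = random.Random(seed)
    ensemble = []

    def predict(x):
        return 1 if sum(h(x) for h in ensemble) >= 0 else -1

    for _ in range(n_rounds):
        if not ensemble:
            # First bite: plain uniform sampling with replacement.
            idx = [rng.randrange(len(X)) for _ in range(bite)]
        else:
            err = sum(predict(x) != yi for x, yi in zip(X, y)) / len(X)
            err = min(max(err, 0.05), 0.95)  # clip so sampling stays feasible
            idx = []
            while len(idx) < bite:
                i = rng.randrange(len(X))
                # Misclassified points are always kept; correct ones are kept
                # with probability err / (1 - err).
                if predict(X[i]) != y[i] or rng.random() < err / (1 - err):
                    idx.append(i)
        ensemble.append(train_stump([X[i] for i in idx], [y[i] for i in idx]))
    return predict

# Toy usage on a separable 1-D problem: only small bites of the 100 points
# are ever seen by any single base learner.
X = [i / 10 for i in range(-50, 50)]
y = [1 if x >= 0 else -1 for x in X]
clf = ivote(X, y)
acc = sum(clf(x) == yi for x, yi in zip(X, y)) / len(X)
```

The acceptance rule concentrates each bite on currently hard examples, which is what lets the ensemble reach near single-classifier accuracy while each base learner trains on only a small fraction of the data.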