Extending AdaBoost to iteratively vary its base classifiers

  • Authors:
  • Érico N. de Souza; Stan Matwin

  • Affiliation:
  • School of Information Technology and Engineering, University of Ottawa, Ottawa, ON, Canada

  • Venue:
  • Canadian AI'11: Proceedings of the 24th Canadian Conference on Advances in Artificial Intelligence
  • Year:
  • 2011


Abstract

This paper introduces AdaBoost Dynamic, an extension of the AdaBoost.M1 algorithm of Freund and Schapire. The extension uses different "weak" classifiers in subsequent iterations of the algorithm, instead of AdaBoost's single fixed base classifier. The algorithm is tested on several datasets from the UCI repository, and the results show that it performs as well as AdaBoost run with the best possible base learner for each dataset. This relieves the machine learning analyst of having to decide in advance which base classifier to use.
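
The abstract describes the idea only at a high level. The sketch below illustrates one plausible reading of it: an AdaBoost.M1-style boosting loop that, instead of refitting a single fixed learner, draws a different base classifier from a pool at each iteration. This is not the authors' implementation; the pool contents, the round-robin cycling policy, and the binary {-1, +1} label encoding are all illustrative assumptions (AdaBoost.M1 itself handles the multiclass case).

```python
# Minimal sketch (assumptions noted above), not the paper's actual code.
import numpy as np
from sklearn.base import clone
from sklearn.tree import DecisionTreeClassifier
from sklearn.naive_bayes import GaussianNB
from sklearn.linear_model import LogisticRegression

def adaboost_dynamic(X, y, base_pool, n_rounds=50):
    """Boosting loop that cycles through heterogeneous base learners.

    X: feature matrix; y: labels in {-1, +1}; base_pool: unfitted estimators.
    Returns a list of (alpha, fitted_model) pairs.
    """
    n = len(y)
    w = np.full(n, 1.0 / n)           # start with uniform sample weights
    ensemble = []
    for t in range(n_rounds):
        # Key difference from standard AdaBoost: vary the base classifier
        # across iterations (here, a simple round-robin over the pool).
        model = clone(base_pool[t % len(base_pool)])
        model.fit(X, y, sample_weight=w)
        pred = model.predict(X)
        err = np.sum(w[pred != y])    # weighted training error
        if err >= 0.5:                # weak-learning condition violated: skip
            continue
        err = max(err, 1e-10)         # guard against log(0)
        alpha = 0.5 * np.log((1 - err) / err)   # classifier weight
        w *= np.exp(-alpha * y * pred)          # up-weight misclassified points
        w /= w.sum()                            # renormalize to a distribution
        ensemble.append((alpha, model))
    return ensemble

def predict(ensemble, X):
    # Weighted majority vote over the boosted, heterogeneous classifiers.
    score = sum(alpha * model.predict(X) for alpha, model in ensemble)
    return np.sign(score)

# Hypothetical pool of "weak" learners cycled across iterations.
pool = [DecisionTreeClassifier(max_depth=1),
        GaussianNB(),
        LogisticRegression(max_iter=1000)]
```

Under this reading, the practical benefit claimed in the abstract follows directly: because the loop tries every learner in the pool, the analyst supplies a pool rather than committing to one base classifier up front, and boosting's example re-weighting assigns larger alphas to whichever learners fit the data well.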