Ensembling Heterogeneous Learning Models with Boosting

  • Authors:
  • Diego S. Nascimento; André L. Coelho

  • Affiliations:
  • Graduate Program in Applied Informatics, University of Fortaleza, Fortaleza, Brazil 60811-905 (both authors)

  • Venue:
  • ICONIP '09 Proceedings of the 16th International Conference on Neural Information Processing: Part I
  • Year:
  • 2009


Abstract

In this paper, we investigate the potential of a novel classifier ensemble scheme, referred to as heterogeneous boosting (HB), which aims at delivering higher levels of diversity by allowing distinct learning algorithms to be recruited to induce the different components of the boosting sequence. For the automatic design of HB structures in accordance with the nuances of the problem at hand, a genetic algorithm engine is adopted to work jointly with AdaBoost, the state-of-the-art boosting algorithm. To validate the novel approach, experiments involving well-known learning algorithms and classification datasets from the UCI repository are discussed. The accuracy, generalization, and diversity levels achieved with HB are compared against those delivered by AdaBoost working solely with RBF neural networks, with the former either significantly outperforming or performing on par with the latter in all cases.
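The core idea of heterogeneous boosting, as the abstract describes it, is an AdaBoost sequence in which each round may use a different base-learning algorithm. The following is a minimal, illustrative sketch of that idea, not the authors' implementation: the toy dataset, the two base learners (a weighted decision stump and a weighted nearest-centroid classifier), and the fixed learner schedule are all assumptions made for the example; in the paper, the schedule of learner types is evolved by a genetic algorithm rather than fixed by hand.

```python
import math

# Toy 2-D dataset with labels in {-1, +1} (illustrative values only).
X = [(0.1, 1.0), (0.3, 0.8), (0.9, 0.2), (0.8, 0.1),
     (0.2, 0.9), (0.7, 0.3), (0.4, 0.6), (0.6, 0.4)]
y = [+1, +1, -1, -1, +1, -1, +1, -1]

def fit_stump(X, y, w):
    """First base-learner type: weighted decision stump
    (best feature/threshold/polarity under the current weights)."""
    best = None
    for f in range(2):
        for thr in sorted({x[f] for x in X}):
            for pol in (+1, -1):
                pred = [pol if x[f] <= thr else -pol for x in X]
                err = sum(wi for wi, p, yi in zip(w, pred, y) if p != yi)
                if best is None or err < best[0]:
                    best = (err, f, thr, pol)
    _, f, thr, pol = best
    return lambda x: pol if x[f] <= thr else -pol

def fit_centroid(X, y, w):
    """Second, distinct base-learner type: weighted nearest-centroid."""
    def centroid(label):
        tot = sum(wi for wi, yi in zip(w, y) if yi == label)
        return tuple(
            sum(wi * x[d] for wi, x, yi in zip(w, X, y) if yi == label) / tot
            for d in range(2))
    cp, cn = centroid(+1), centroid(-1)
    dist = lambda a, b: sum((ai - bi) ** 2 for ai, bi in zip(a, b))
    return lambda x: +1 if dist(x, cp) <= dist(x, cn) else -1

def heterogeneous_boost(X, y, schedule):
    """Standard AdaBoost loop, except each round draws its base learner
    from `schedule` (a stand-in for the GA-evolved learner sequence)."""
    n = len(X)
    w = [1.0 / n] * n
    ensemble = []
    for fit in schedule:
        h = fit(X, y, w)
        eps = sum(wi for wi, x, yi in zip(w, X, y) if h(x) != yi)
        eps = min(max(eps, 1e-10), 1 - 1e-10)   # avoid log(0)
        alpha = 0.5 * math.log((1 - eps) / eps)
        w = [wi * math.exp(-alpha * yi * h(x)) for wi, x, yi in zip(w, X, y)]
        z = sum(w)
        w = [wi / z for wi in w]
        ensemble.append((alpha, h))
    return lambda x: 1 if sum(a * h(x) for a, h in ensemble) >= 0 else -1

# A fixed alternating schedule stands in for the GA-chosen component types.
predict = heterogeneous_boost(X, y, [fit_stump, fit_centroid, fit_stump])
correct = sum(predict(x) == yi for x, yi in zip(X, y))
print(correct, "of", len(X), "training points correct")
```

Mixing learner families in the schedule is what the paper argues raises ensemble diversity; the genetic algorithm's role is to search over such schedules to match the dataset at hand.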