Improving adaptive boosting with a relaxed equation to update the sampling distribution

  • Authors:
  • Joaquín Torres-Sospedra;Carlos Hernández-Espinosa;Mercedes Fernández-Redondo

  • Affiliations:
  • Departamento de Ingenieria y Ciencia de los Computadores, Universitat Jaume I, Castellon, Spain (all authors)

  • Venue:
  • IWANN'07: Proceedings of the 9th International Work-Conference on Artificial Neural Networks
  • Year:
  • 2007

Abstract

Adaptive Boosting (Adaboost) is one of the best-known methods for building an ensemble of neural networks. In this paper we briefly analyze and combine two of the most important variants of Adaboost, Averaged Boosting and Conservative Boosting, in order to build a more robust ensemble of neural networks. The combined method, called Averaged Conservative Boosting (ACB), applies the conservative equation used in Conserboost along with the averaging procedure used in Aveboost to update the sampling distribution. We have tested the methods on seven databases from the UCI repository. The results show that Averaged Conservative Boosting is the best performing method.
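The abstract does not give the exact update equations, but the idea it describes can be sketched as follows: apply a conservative update (raise only the weights of misclassified patterns, as in Conserboost) and then average the result with the previous distribution (as in Aveboost). The sketch below is a minimal illustration under those assumptions, not the paper's verbatim formulation; the function name and the `beta = eps / (1 - eps)` weighting are assumptions borrowed from standard Adaboost variants.

```python
import numpy as np

def acb_update(dist, miscls_mask, t):
    """Hedged sketch of an Averaged Conservative Boosting (ACB)
    distribution update; the exact equations are assumed, not quoted
    from the paper.

    dist        -- current sampling distribution D_t (sums to 1)
    miscls_mask -- boolean array, True where network t misclassified
    t           -- 1-based index of the current network
    """
    # Weighted training error of network t, clipped so beta stays defined.
    eps = float(dist[miscls_mask].sum())
    eps = min(max(eps, 1e-10), 1.0 - 1e-10)
    beta = eps / (1.0 - eps)

    # Conservative step (Conserboost-style): only the weights of the
    # misclassified patterns are raised; correct ones are left untouched.
    cons = dist.copy()
    cons[miscls_mask] /= beta          # beta < 1 when eps < 0.5
    cons /= cons.sum()                 # renormalize to a distribution

    # Averaged step (Aveboost-style): blend with the previous distribution
    # so a single network cannot shift the sampling weights too abruptly.
    new_dist = (t * dist + cons) / (t + 1)
    return new_dist / new_dist.sum()
```

The averaging step is what damps the aggressive weight growth that plain Adaboost exhibits on noisy patterns, while the conservative step avoids shrinking the weights of patterns the ensemble already handles.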