Incremental algorithm driven by error margins

  • Authors:
  • Gonzalo Ramos-Jiménez, José del Campo-Ávila, Rafael Morales-Bueno

  • Affiliations:
  • Departamento de Lenguajes y Ciencias de la Computación, E.T.S. Ingeniería Informática, Universidad de Málaga, Málaga, Spain (all authors)

  • Venue:
  • DS'06 Proceedings of the 9th international conference on Discovery Science
  • Year:
  • 2006

Abstract

Incremental learning is well suited to classification when data sets are too large to process at once or when new examples can arrive at any time. Forgetting examples while retaining only the relevant information allows memory requirements to be reduced. The algorithm presented in this paper, called IADEM, has been developed following these approaches together with other concepts such as Chernoff and Hoeffding bounds. Its most relevant features are its ability to induce accurate trees from data sets of any size and its capacity to keep an updated estimate of the error of the tree being induced. This error estimate is fundamental both to satisfying the user's requirements about the desired error of the tree and to detecting noise in the data sets.
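
The abstract does not detail how the bounds are applied inside IADEM; as a hedged illustration only, the sketch below shows the way a Hoeffding bound is commonly used in incremental decision-tree induction: from the examples seen so far at a leaf, decide whether the observed advantage of the best split attribute over the second best is statistically reliable. The names `hoeffding_bound` and `should_split`, and all parameter values, are illustrative assumptions, not taken from the paper.

```python
import math


def hoeffding_bound(value_range: float, delta: float, n: int) -> float:
    """Hoeffding bound: with probability at least 1 - delta, the true mean of a
    random variable whose values span `value_range` lies within epsilon of the
    observed mean of n independent samples."""
    return math.sqrt((value_range ** 2) * math.log(1.0 / delta) / (2.0 * n))


def should_split(best_gain: float, second_gain: float,
                 value_range: float, delta: float, n: int) -> bool:
    """Illustrative split test: expand a leaf only when the observed gain
    difference between the two best attributes exceeds the Hoeffding bound,
    i.e. the apparent advantage is unlikely to be due to sampling noise."""
    epsilon = hoeffding_bound(value_range, delta, n)
    return (best_gain - second_gain) > epsilon


# Example: after 5000 examples at a leaf, observed gains 0.30 vs 0.25,
# information gain bounded by 1.0 (two classes), confidence 95% (delta = 0.05).
if __name__ == "__main__":
    print(should_split(0.30, 0.25, value_range=1.0, delta=0.05, n=5000))
```

The same kind of bound can be maintained per node to keep a running, probabilistic estimate of the tree's error as new examples arrive, which is the property the abstract highlights; the details of how IADEM combines Chernoff and Hoeffding bounds for that purpose are given in the paper itself.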