AdaTree: Boosting a Weak Classifier into a Decision Tree

  • Authors: Etienne Grossmann
  • Affiliations: University of Kentucky, Lexington
  • Venue: CVPRW '04 Proceedings of the 2004 Conference on Computer Vision and Pattern Recognition Workshop (CVPRW'04), Volume 6
  • Year: 2004

Abstract

We present a boosting method that results in a decision tree rather than a fixed linear sequence of classifiers. Equivalently, we present a tree-growing method whose performance can be analysed in the framework of AdaBoost. We argue that AdaBoost can be improved by presenting the input to a sequence of weak classifiers, each one tuned to the conditional probability distribution determined by the outputs of the previous weak classifiers. As a result, the final classifier has a tree structure rather than a linear one, hence the name "AdaTree". One consequence of the tree structure is that different inputs may require different processing times. Early experiments show a reduced computation cost with respect to AdaBoost. One of our intended applications is real-time detection, where cascades of boosted detectors have recently become successful. The reduced computation cost of the proposed method shows some potential for use directly in detection problems, without the need for a cascade.
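The core idea, routing an input through a tree of weak classifiers where each child is trained only on the samples its parent routed to it, can be illustrated with a short sketch. The Python code below is a minimal, hypothetical illustration built from decision stumps; it is not the paper's exact algorithm (in particular, the paper's node training and guarantees follow the AdaBoost weighting framework, while this sketch simply refits a stump on each conditional subset of the data).

```python
import numpy as np

class Stump:
    """Axis-aligned decision stump: predicts polarity * sign(x[feature] - threshold)."""
    def fit(self, X, y, w):
        best_err = np.inf
        for f in range(X.shape[1]):
            for t in np.unique(X[:, f]):
                for p in (1, -1):
                    pred = p * np.sign(X[:, f] - t)
                    pred[pred == 0] = p
                    err = w[pred != y].sum()
                    if err < best_err:
                        best_err, self.f, self.t, self.p = err, f, t, p
        return self

    def predict(self, X):
        pred = self.p * np.sign(X[:, self.f] - self.t)
        pred[pred == 0] = self.p
        return pred

class AdaTreeNode:
    """One node of the tree: a weak classifier tuned to the samples that reach it."""
    def __init__(self, depth=0):
        self.depth = depth
        self.children = {}    # maps stump output (-1 or +1) to a child node
        self.leaf_label = {}

    def fit(self, X, y, max_depth=3, min_samples=5):
        w = np.full(len(y), 1.0 / len(y))   # uniform weights (a simplification)
        self.stump = Stump().fit(X, y, w)
        out = self.stump.predict(X)
        for branch in (-1, 1):
            mask = out == branch
            # Fallback label: majority class among samples routed to this branch.
            self.leaf_label[branch] = branch if not mask.any() else (
                1 if y[mask].sum() >= 0 else -1)
            # Grow a child tuned to the conditional distribution of this branch.
            if (self.depth < max_depth and mask.sum() >= min_samples
                    and len(np.unique(y[mask])) > 1):
                self.children[branch] = AdaTreeNode(self.depth + 1).fit(
                    X[mask], y[mask], max_depth, min_samples)
        return self

    def predict_one(self, x):
        out = self.stump.predict(x[None, :])[0]
        child = self.children.get(out)
        # Inputs stopping at shallow leaves are classified with fewer evaluations.
        return child.predict_one(x) if child is not None else self.leaf_label[out]

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    X = rng.normal(size=(200, 2))
    y = np.where(X[:, 0] * X[:, 1] > 0, 1, -1)  # XOR-like labels: one stump cannot suffice
    tree = AdaTreeNode().fit(X, y, max_depth=3)
    preds = np.array([tree.predict_one(x) for x in X])
    print("training accuracy:", (preds == y).mean())
```

Note how prediction cost varies per input: a sample that stops at a shallow leaf is classified after evaluating fewer stumps than one that descends the full tree, which is the property the abstract highlights for real-time detection.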