Decision Tree Learning Using a Bayesian Approach at Each Node

  • Authors:
  • Mirela Andronescu; Mark Brodie

  • Affiliations:
  • University of Washington, Seattle, WA 98195, USA; Simpson College, Indianola, IA 50125, USA

  • Venue:
  • Canadian AI '09 Proceedings of the 22nd Canadian Conference on Artificial Intelligence: Advances in Artificial Intelligence
  • Year:
  • 2009

Abstract

We explore the problem of learning decision trees using a Bayesian approach, called TREBBLE (TREe Building by Bayesian LEarning), in which a population of decision trees is generated by constructing trees using probability distributions at each node. Predictions are made either by using Bayesian Model Averaging to combine information from all the trees (TREBBLE-BMA) or by using the single most likely tree (TREBBLE-MAP), depending on what is appropriate for the particular application domain. We show on benchmark data sets that this method is more accurate than the traditional decision tree learning algorithm C4.5 and is as accurate as the Bayesian method SimTree, while being much simpler to understand and implement. In many application domains, such as help-desks and medical diagnosis, a decision tree needs to be learned from a prior tree (provided by an expert) and some (usually small) amount of training data. We show how TREBBLE-MAP can be used to learn a single tree that performs better than using either the prior tree or the training data alone.
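The sketch below illustrates, in Python, the general idea the abstract describes: grow a population of decision trees by sampling the split attribute at each node from a probability distribution (here, a softmax over information gain), then predict either by weighting all trees in a Bayesian-Model-Averaging style or by using only the highest-weight (MAP-like) tree. The weighting scheme, function names, and toy data are illustrative assumptions, not the authors' actual TREBBLE algorithm.

```python
# Hypothetical sketch of the idea in the abstract, not the authors' TREBBLE code:
# sample many trees via per-node split distributions, then predict by weighted
# averaging (BMA-style) or with the single best-weighted (MAP-style) tree.
import math
import random
from collections import Counter

def entropy(labels):
    n = len(labels)
    return -sum((c / n) * math.log2(c / n) for c in Counter(labels).values())

def info_gain(rows, labels, attr):
    # Reduction in entropy from splitting on a categorical attribute.
    splits = {}
    for row, y in zip(rows, labels):
        splits.setdefault(row[attr], []).append(y)
    return entropy(labels) - sum(len(ys) / len(labels) * entropy(ys) for ys in splits.values())

def sample_tree(rows, labels, attrs, rng, temperature=1.0):
    """Build one tree, sampling each split attribute from softmax(info gain)."""
    if len(set(labels)) == 1 or not attrs:
        return Counter(labels).most_common(1)[0][0]  # leaf: majority label
    gains = [info_gain(rows, labels, a) for a in attrs]
    weights = [math.exp(g / temperature) for g in gains]
    attr = rng.choices(attrs, weights=weights, k=1)[0]
    node = {"attr": attr, "children": {}, "default": Counter(labels).most_common(1)[0][0]}
    remaining = [a for a in attrs if a != attr]
    for value in set(row[attr] for row in rows):
        sub = [(r, y) for r, y in zip(rows, labels) if r[attr] == value]
        sub_rows, sub_labels = zip(*sub)
        node["children"][value] = sample_tree(list(sub_rows), list(sub_labels), remaining, rng, temperature)
    return node

def classify(tree, row):
    while isinstance(tree, dict):
        tree = tree["children"].get(row[tree["attr"]], tree["default"])
    return tree

def accuracy(tree, rows, labels):
    return sum(classify(tree, r) == y for r, y in zip(rows, labels)) / len(labels)

def trebble_like(rows, labels, attrs, n_trees=25, seed=0):
    """Return (weighted tree population, MAP-style tree). Training accuracy is
    used as a crude stand-in for a posterior weight."""
    rng = random.Random(seed)
    population = [sample_tree(rows, labels, attrs, rng) for _ in range(n_trees)]
    weights = [accuracy(t, rows, labels) for t in population]
    map_tree = population[max(range(n_trees), key=lambda i: weights[i])]
    return list(zip(population, weights)), map_tree

def predict_bma(weighted_trees, row):
    """BMA-style prediction: each tree votes with its weight."""
    votes = Counter()
    for tree, w in weighted_trees:
        votes[classify(tree, row)] += w
    return votes.most_common(1)[0][0]

if __name__ == "__main__":
    # Tiny illustrative data set: two categorical attributes, binary label.
    rows = [{"outlook": o, "windy": w} for o in ("sunny", "rain") for w in ("yes", "no")] * 3
    labels = ["play" if r["outlook"] == "sunny" and r["windy"] == "no" else "skip" for r in rows]
    weighted_trees, map_tree = trebble_like(rows, labels, ["outlook", "windy"])
    test = {"outlook": "sunny", "windy": "no"}
    print("BMA prediction:", predict_bma(weighted_trees, test))
    print("MAP prediction:", classify(map_tree, test))
```

Averaging over the population (BMA) trades interpretability for robustness, while keeping only the single most likely tree (MAP) preserves the readability that makes decision trees useful in expert-facing domains such as help-desks and diagnosis, which is the trade-off the abstract highlights.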