PUBLIC: A Decision Tree Classifier that Integrates Building and Pruning

  • Authors:
  • Rajeev Rastogi; Kyuseok Shim

  • Affiliations:
  • Bell Laboratories, 600 Mountain Ave., Murray Hill, NJ 07974, USA. rastogi@bell-labs.com; Korea Advanced Institute of Science and Technology, and Advanced Information Technology Research Center, 373-1 Kusong-dong, Yusong-gu, Taejon 305-701, South Korea. shim@cs.kaist.ac.kr

  • Venue:
  • Data Mining and Knowledge Discovery
  • Year:
  • 2000

Abstract

Classification is an important problem in data mining. Given a database of records, each with a class label, a classifier generates a concise and meaningful description for each class that can be used to classify subsequent records. A number of popular classifiers construct decision trees to generate class models. These classifiers first build a decision tree and then prune subtrees from the decision tree in a subsequent pruning phase to improve accuracy and prevent "overfitting". Generating the decision tree in two distinct phases can result in a substantial amount of wasted effort, since an entire subtree constructed in the first phase may later be pruned in the next phase. In this paper, we propose PUBLIC, an improved decision tree classifier that integrates the second "pruning" phase with the initial "building" phase. In PUBLIC, a node is not expanded during the building phase if it is determined that it will be pruned during the subsequent pruning phase. To make this determination for a node before it is expanded, PUBLIC computes a lower bound on the minimum cost of any subtree rooted at the node. PUBLIC then uses this estimate to identify nodes that are certain to be pruned and avoids expending effort on splitting them. Experimental results with real-life as well as synthetic data sets demonstrate the effectiveness of PUBLIC's integrated approach, which can deliver substantial performance improvements.
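To make the idea concrete, the sketch below illustrates the kind of test the abstract describes: before splitting a node, compare a lower bound on the cost of any subtree rooted there against the cost of simply keeping the node as a leaf. The cost model and the trivial lower bound used here are hypothetical, simplified stand-ins (leaf_cost, subtree_cost_lower_bound, and should_expand are illustrative names, not the paper's formulas); they only show the shape of the check, not PUBLIC's actual MDL-based bounds.

```python
from collections import Counter

def leaf_cost(labels):
    """Simplified MDL-style cost of keeping a node as a leaf:
    one unit to encode the node plus one unit per record
    misclassified by the majority class (an assumption for
    illustration, not the paper's exact cost)."""
    if not labels:
        return 1
    majority = Counter(labels).most_common(1)[0][1]
    return 1 + (len(labels) - majority)

def subtree_cost_lower_bound(labels):
    """Lower bound on the cost of *any* subtree rooted at this node.
    The trivial bound used here is 1 (every subtree has at least one
    node); PUBLIC derives much tighter bounds, which is where its
    savings come from."""
    return 1

def should_expand(labels):
    """Integrated build-and-prune test: expand the node only if some
    subtree could still be cheaper than pruning it to a leaf."""
    return subtree_cost_lower_bound(labels) < leaf_cost(labels)

# A pure node is never worth expanding; a mixed node may be.
print(should_expand(["A", "A", "A"]))        # False: leaf cost 1, bound 1
print(should_expand(["A", "A", "B", "B"]))   # True:  leaf cost 3, bound 1
```

With the trivial bound above, almost every impure node passes the test; the performance gains reported in the paper come from computing much tighter lower bounds so that many nodes fail the test and are never split.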