Evaluation of decision tree pruning with subadditive penalties

  • Authors:
  • Sergio García-Moratilla, Gonzalo Martínez-Muñoz, Alberto Suárez

  • Affiliations:
  • Universidad Autónoma de Madrid, Madrid, Spain (all authors)

  • Venue:
  • IDEAL 2006: Proceedings of the 7th International Conference on Intelligent Data Engineering and Automated Learning
  • Year:
  • 2006

Abstract

Recent work on decision tree pruning [1] has brought to the attention of the machine learning community the fact that, in classification problems, the use of subadditive penalties in cost-complexity pruning has a stronger theoretical basis than the usual additive penalty terms. We implement cost-complexity pruning algorithms with general size-dependent penalties to confirm the results of [1]: namely, that the family of pruned subtrees selected by pruning with a subadditive penalty of increasing strength is a subset of the family selected using additive penalties. Consequently, this family of pruned trees is unique, nested, and can be computed efficiently. However, in spite of the better theoretical grounding of cost-complexity pruning with subadditive penalties, we find no systematic improvement in the generalization performance of the final classification tree selected by cross-validation when subadditive penalties are used instead of the commonly used additive ones.
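To make the setting concrete, the sketch below (not the authors' implementation) uses scikit-learn's CART trees, which implement standard cost-complexity pruning with the additive penalty: the `cost_complexity_pruning_path` method returns the nested family of pruned subtrees indexed by the penalty strength `ccp_alpha`, and cross-validation then selects the final tree, as in the evaluation protocol described in the abstract. The dataset and cross-validation settings are illustrative assumptions; per the paper's result, the subtrees that a subadditive penalty would select form a subset of this same nested family.

```python
# Minimal sketch: additive cost-complexity pruning plus cross-validated selection.
# Assumptions (not from the paper): breast-cancer dataset, 10-fold CV, scikit-learn CART.
import numpy as np
from sklearn.datasets import load_breast_cancer
from sklearn.model_selection import cross_val_score
from sklearn.tree import DecisionTreeClassifier

X, y = load_breast_cancer(return_X_y=True)

# Nested family of pruned subtrees produced by additive cost-complexity pruning.
path = DecisionTreeClassifier(random_state=0).cost_complexity_pruning_path(X, y)
ccp_alphas = path.ccp_alphas

# Select the penalty strength (and hence the pruned subtree) by cross-validation.
cv_scores = [
    cross_val_score(
        DecisionTreeClassifier(random_state=0, ccp_alpha=alpha), X, y, cv=10
    ).mean()
    for alpha in ccp_alphas
]
best_alpha = ccp_alphas[int(np.argmax(cv_scores))]
print(f"selected ccp_alpha = {best_alpha:.5f}")
```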