Learning decision rules in noisy domains
Proceedings of Expert Systems '86, the 6th Annual Technical Conference on Research and Development in Expert Systems III
C4.5: programs for machine learning
A Comparative Analysis of Methods for Pruning Decision Trees
IEEE Transactions on Pattern Analysis and Machine Intelligence
Generalization Bounds for Decision Trees
COLT '00 Proceedings of the Thirteenth Annual Conference on Computational Learning Theory
An analysis of reduced error pruning
Journal of Artificial Intelligence Research
Analysis of a complexity-based pruning scheme for classification trees
IEEE Transactions on Information Theory
k-norm misclassification rate estimation for decision trees
ASC '07 Proceedings of the Eleventh IASTED International Conference on Artificial Intelligence and Soft Computing
The pruning phase is a necessary step in decision tree induction. Existing pruning algorithms tend to suffer from some or all of the following difficulties: 1) lack of theoretical support; 2) high computational complexity; 3) dependence on a validation set; 4) complicated implementation. The 2-norm pruning algorithm proposed here addresses all of these difficulties. This paper presents an experimental comparison between the 2-norm pruning algorithm and two classical pruning algorithms, Minimal Cost-Complexity pruning (used in CART) and Error-Based pruning (used in C4.5), and confirms that the 2-norm pruning algorithm is superior in both accuracy and speed.
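The algorithms compared in the abstract all follow the same bottom-up pattern: visit each internal node after its children and collapse it to a leaf when an error estimate says the subtree does not help. The sketch below illustrates only that shared pattern; the error estimator used here is plain resubstitution error, a placeholder, not the paper's 2-norm (k-norm) estimator, CART's cost-complexity measure, or C4.5's error-based estimate.

```python
# Hedged sketch of generic bottom-up decision-tree pruning.
# Assumption: `errors` counts training misclassifications if the node
# were turned into a leaf; the real algorithms plug in their own estimators.
from dataclasses import dataclass, field
from typing import List


@dataclass
class Node:
    n: int                                  # training examples reaching this node
    errors: int                             # misclassifications if made a leaf
    children: List["Node"] = field(default_factory=list)

    @property
    def is_leaf(self) -> bool:
        return not self.children


def subtree_error(node: Node) -> int:
    """Total estimated error of the subtree rooted at `node`."""
    if node.is_leaf:
        return node.errors
    return sum(subtree_error(c) for c in node.children)


def prune(node: Node) -> Node:
    """Prune children first, then collapse this node to a leaf
    if doing so does not increase the estimated error."""
    node.children = [prune(c) for c in node.children]
    if not node.is_leaf and node.errors <= subtree_error(node):
        node.children = []                  # replace the subtree with a leaf
    return node
```

Replacing `subtree_error` and the comparison at each node with a different estimate (e.g. a complexity-penalized or validation-based one) yields the different pruning behaviors the paper benchmarks.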