Many researchers have shown that ensemble methods such as Boosting and Bagging improve classification accuracy. Boosting and Bagging perform particularly well with unstable learning algorithms such as neural networks and decision trees. Pruning decision tree classifiers is intended to make trees simpler and more comprehensible and to avoid over-fitting. However, it is known that pruning the individual classifiers of an ensemble does not necessarily lead to improved generalisation. Examples of individual tree pruning methods are Minimum Error Pruning (MEP), Error-based Pruning (EBP), Reduced-Error Pruning (REP), Critical Value Pruning (CVP) and Cost-Complexity Pruning (CCP). In this paper, we report the results of applying Boosting and Bagging with these five pruning methods to eleven datasets.
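The sketch below illustrates the general shape of such a comparison; it is not the paper's original setup. It assumes scikit-learn 1.2 or later (older versions name the estimator parameter base_estimator) and uses Cost-Complexity Pruning, the only one of the five pruners scikit-learn exposes directly, via ccp_alpha. The dataset, pruning strength and ensemble size are illustrative placeholders.

    # A minimal sketch, not the paper's original experiment: Bagging and
    # Boosting over decision trees with and without Cost-Complexity
    # Pruning (CCP). Requires scikit-learn >= 1.2.
    from sklearn.datasets import load_breast_cancer
    from sklearn.ensemble import AdaBoostClassifier, BaggingClassifier
    from sklearn.model_selection import cross_val_score
    from sklearn.tree import DecisionTreeClassifier

    X, y = load_breast_cancer(return_X_y=True)  # placeholder dataset

    base_trees = {
        "unpruned": DecisionTreeClassifier(random_state=0),
        "CCP-pruned": DecisionTreeClassifier(ccp_alpha=0.01,  # illustrative alpha
                                             random_state=0),
    }

    for tree_name, tree in base_trees.items():
        ensembles = {
            "Bagging": BaggingClassifier(estimator=tree, n_estimators=50,
                                         random_state=0),
            "Boosting": AdaBoostClassifier(estimator=tree, n_estimators=50,
                                           random_state=0),
        }
        for ens_name, ensemble in ensembles.items():
            scores = cross_val_score(ensemble, X, y, cv=10)  # 10-fold CV accuracy
            print(f"{ens_name} + {tree_name} tree: "
                  f"{scores.mean():.3f} +/- {scores.std():.3f}")

Re-running the loop over a range of ccp_alpha values makes the abstract's point concrete: the pruning level that is best for a single tree need not be the one that is best for the same tree used inside Bagging or Boosting.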