An Empirical Comparison of Pruning Methods for Ensemble Classifiers

  • Authors:
  • Terry Windeatt; Gholamreza Ardeshir

  • Venue:
  • IDA '01: Proceedings of the 4th International Conference on Advances in Intelligent Data Analysis
  • Year:
  • 2001

Abstract

Many researchers have shown that ensemble methods such as Boosting and Bagging improve classification accuracy. Boosting and Bagging perform well with unstable learning algorithms such as neural networks or decision trees. Pruning decision tree classifiers is intended to make trees simpler and more comprehensible and to avoid over-fitting. However, it is known that pruning the individual classifiers of an ensemble does not necessarily lead to improved generalisation. Examples of individual tree pruning methods are Minimum Error Pruning (MEP), Error-based Pruning (EBP), Reduced-Error Pruning (REP), Critical Value Pruning (CVP) and Cost-Complexity Pruning (CCP). In this paper, we report the results of applying Boosting and Bagging with these five pruning methods to eleven datasets.
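
For a concrete sense of the kind of experiment the abstract describes, the sketch below (not the authors' original code) combines Bagging and Boosting with unpruned and pruned decision trees in scikit-learn. Of the five pruning methods compared in the paper, only Cost-Complexity Pruning is exposed by scikit-learn (via the ccp_alpha parameter); the dataset and parameter values here are illustrative placeholders, not the paper's setup.

```python
# A minimal sketch, assuming scikit-learn: Bagging and Boosting ensembles
# built over unpruned vs. CCP-pruned decision trees, scored by
# cross-validation. Only CCP (via ccp_alpha) is available here; the other
# four pruning methods from the paper (MEP, EBP, REP, CVP) are not
# implemented in scikit-learn.
from sklearn.datasets import load_breast_cancer
from sklearn.ensemble import AdaBoostClassifier, BaggingClassifier
from sklearn.model_selection import cross_val_score
from sklearn.tree import DecisionTreeClassifier

# Placeholder dataset; the paper uses eleven datasets.
X, y = load_breast_cancer(return_X_y=True)

# Base trees: one unpruned, one pruned via cost-complexity pruning.
# ccp_alpha=0.01 is an arbitrary illustrative value.
base_trees = {
    "unpruned": DecisionTreeClassifier(random_state=0),
    "CCP-pruned": DecisionTreeClassifier(ccp_alpha=0.01, random_state=0),
}

for tree_name, tree in base_trees.items():
    for ens_name, ensemble in [
        ("Bagging", BaggingClassifier(tree, n_estimators=50, random_state=0)),
        ("Boosting", AdaBoostClassifier(tree, n_estimators=50, random_state=0)),
    ]:
        scores = cross_val_score(ensemble, X, y, cv=10)
        print(f"{ens_name} + {tree_name}: mean accuracy {scores.mean():.3f}")
```

Comparing the printed accuracies across the pruned and unpruned rows mirrors, in miniature, the paper's question of whether pruning the individual ensemble members helps or hurts generalisation.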