A Lazy Approach to Pruning Classification Rules

  • Authors:
  • Elena Baralis; Paolo Garza

  • Venue:
  • ICDM '02 Proceedings of the 2002 IEEE International Conference on Data Mining
  • Year:
  • 2002

Abstract

Associative classification is a promising technique for the generation of highly precise classifiers. Previous works propose several clever techniques to prune the huge set of generated rules, with the twofold aim of selecting a small set of high-quality rules and reducing the chance of overfitting. In this paper, we argue that pruning should be reduced to a minimum and that the availability of a large rule base may improve the precision of the classifier without affecting its performance. In L3 (Live and Let Live), a new algorithm for associative classification, a lazy pruning technique iteratively discards all rules that only yield wrong case classifications. Classification is performed in two steps. Initially, rules which have already correctly classified at least one training case, sorted by confidence, are considered. If the case is still unclassified, the remaining rules (unused during the training phase) are considered, again sorted by confidence. Extensive experiments on 26 databases from the UCI machine learning database repository show that L3 improves the classification precision with respect to previous approaches.
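The two-step classification described in the abstract can be sketched as follows. This is a minimal illustration, not the paper's actual implementation: the `Rule` fields and the `used_in_training` flag (marking rules that correctly classified at least one training case) are assumed data structures introduced here for clarity.

```python
from dataclasses import dataclass

@dataclass
class Rule:
    antecedent: frozenset   # items that must appear in the case
    label: str              # predicted class
    confidence: float
    used_in_training: bool  # correctly classified >= 1 training case?

def classify(case_items, rules):
    """Two-level lookup: level-I rules (used during training) first,
    each level sorted by descending confidence; level-II (spare)
    rules are consulted only if no level-I rule matches."""
    level1 = sorted((r for r in rules if r.used_in_training),
                    key=lambda r: -r.confidence)
    level2 = sorted((r for r in rules if not r.used_in_training),
                    key=lambda r: -r.confidence)
    for rule in level1 + level2:
        if rule.antecedent <= case_items:  # antecedent satisfied
            return rule.label
    return None  # case remains unclassified
```

Note that a lower-confidence level-I rule still takes precedence over any level-II rule, which is what distinguishes this scheme from a single confidence-sorted rule list.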