Associative classification yields accurate models but suffers from high model generation time. Most of that time is spent extracting and postprocessing a large set of irrelevant rules, which are eventually pruned. We propose I-prune, an item-pruning approach that identifies uninteresting items by means of an interestingness measure and prunes them as soon as they are detected. The number of extracted rules is thus reduced, and model generation time decreases correspondingly. A wide set of experiments on real and synthetic data sets was performed to evaluate I-prune and to select the appropriate interestingness measure. The results show that I-prune significantly reduces model generation time while increasing (or at worst preserving) model accuracy. The experimental evaluation also points to chi-square as the most effective interestingness measure for item pruning. © 2012 Wiley Periodicals, Inc.
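The core idea — scoring each item against the class variable with a chi-square statistic and discarding items below a significance threshold before rule extraction — can be sketched as follows. This is a hypothetical illustration, not the paper's implementation: the function names, transaction representation, and the default threshold of 3.84 (the chi-square critical value at p = 0.05 with one degree of freedom) are assumptions made for the example.

```python
def chi_square(n11, n10, n01, n00):
    """Chi-square statistic for a 2x2 item/class contingency table.

    n11: transactions containing the item with the target class
    n10: transactions containing the item with another class
    n01: transactions lacking the item with the target class
    n00: transactions lacking the item with another class
    """
    n = n11 + n10 + n01 + n00
    row1, row0 = n11 + n10, n01 + n00
    col1, col0 = n11 + n01, n10 + n00
    chi = 0.0
    for observed, r, c in ((n11, row1, col1), (n10, row1, col0),
                           (n01, row0, col1), (n00, row0, col0)):
        expected = r * c / n
        if expected > 0:
            chi += (observed - expected) ** 2 / expected
    return chi

def prune_items(transactions, labels, target_class, threshold=3.84):
    """Keep only items whose chi-square with the class reaches the threshold.

    transactions: list of item sets; labels: class label per transaction.
    Items failing the test are pruned before any rule mining takes place,
    shrinking the search space for the subsequent extraction step.
    """
    n = len(transactions)
    items = {item for t in transactions for item in t}
    kept = set()
    for item in items:
        n11 = sum(1 for t, y in zip(transactions, labels)
                  if item in t and y == target_class)
        n10 = sum(1 for t, y in zip(transactions, labels)
                  if item in t and y != target_class)
        n01 = sum(1 for t, y in zip(transactions, labels)
                  if item not in t and y == target_class)
        n00 = n - n11 - n10 - n01
        if chi_square(n11, n10, n01, n00) >= threshold:
            kept.add(item)
    return kept
```

On a toy data set where item `a` appears only in class-1 transactions and item `x` appears equally in both classes, `a` survives the test while `x` is pruned; because chi-square is symmetric, items perfectly associated with the *other* class are also retained, which is the desired behavior for a multi-class rule miner.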