Instance-Based Learning Algorithms
Machine Learning
Rule induction with CN2: some recent improvements
EWSL-91 Proceedings of the European Working Session on Learning
Improved boosting algorithms using confidence-rated predictions
COLT '98 Proceedings of the Eleventh Annual Conference on Computational Learning Theory
A simple, fast, and effective rule learner
AAAI '99/IAAI '99 Proceedings of the Sixteenth National Conference on Artificial Intelligence and the Eleventh Innovative Applications of Artificial Intelligence Conference
Reduction Techniques for Instance-Based Learning Algorithms
Machine Learning
Machine Learning
Machine Learning
ICML '97 Proceedings of the Fourteenth International Conference on Machine Learning
Noise Elimination in Inductive Concept Learning: A Case Study in Medical Diagnosis
ALT '96 Proceedings of the 7th International Workshop on Algorithmic Learning Theory
Reduced complexity rule induction
IJCAI '91 Proceedings of the 12th International Joint Conference on Artificial Intelligence - Volume 2
A study of cross-validation and bootstrap for accuracy estimation and model selection
IJCAI '95 Proceedings of the 14th International Joint Conference on Artificial Intelligence - Volume 2
Pruning classification rules with reference vector selection methods
ICAISC '10 Proceedings of the 10th International Conference on Artificial Intelligence and Soft Computing: Part I
Highly scalable and robust rule learner: performance evaluation and comparison
IEEE Transactions on Systems, Man, and Cybernetics, Part B: Cybernetics
The condensed nearest neighbor rule (Corresp.)
IEEE Transactions on Information Theory
The reduced nearest neighbor rule (Corresp.)
IEEE Transactions on Information Theory
An algorithm for a selective nearest neighbor decision rule (Corresp.)
IEEE Transactions on Information Theory
Simple Hybrid and Incremental Postpruning Techniques for Rule Induction
IEEE Transactions on Knowledge and Data Engineering
A new prepruning technique is presented that applies instance reduction to the training data before rule induction. An empirical evaluation records the predictive accuracy and size of the rule sets generated from 24 datasets from the UCI Machine Learning Repository. Three instance reduction algorithms (Edited Nearest Neighbour, AllKnn and DROP5) are compared. Each is used to reduce the size of the training set before inducing a set of rules with Clark and Boswell's modification of CN2. A hybrid instance reduction algorithm (combining AllKnn and DROP5) is also tested. For most of the datasets, pruning the training set with ENN, AllKnn or the hybrid significantly reduces the number of rules generated by CN2 without adversely affecting predictive performance. The hybrid achieves the highest average predictive accuracy.
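To illustrate the first stage of the pipeline, the following is a minimal sketch of Edited Nearest Neighbour (Wilson editing), the simplest of the three reduction algorithms compared in the abstract: an instance is discarded when the majority vote of its k nearest neighbours disagrees with its own label. The function and parameter names are illustrative only, not taken from the paper.

```python
# Edited Nearest Neighbour (ENN) instance reduction: a minimal sketch.
# An instance is dropped when the majority label of its k nearest
# neighbours (leave-one-out) differs from its own label. Names here
# are illustrative, not from the paper's implementation.
from collections import Counter
import math

def enn_filter(X, y, k=3):
    """Return indices of the instances kept after ENN editing."""
    keep = []
    for i, xi in enumerate(X):
        # Distance to every other instance, paired with its label.
        dists = sorted(
            (math.dist(xi, xj), y[j])
            for j, xj in enumerate(X) if j != i
        )
        votes = Counter(label for _, label in dists[:k])
        # Keep the instance only if its neighbourhood agrees with it.
        if votes.most_common(1)[0][0] == y[i]:
            keep.append(i)
    return keep
```

The filtered training set (`[X[i] for i in keep]`) would then be passed to the rule inducer; AllKnn repeats this test for several neighbourhood sizes, and DROP5 additionally removes instances whose deletion does not hurt its neighbours' classification.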