Windowing has been proposed as a procedure for efficient memory use in the ID3 decision tree learning algorithm. However, previous work has shown that windowing often leads to a decrease in performance. In this work, we argue that rule learning algorithms are better suited to windowing than decision tree learners, because they typically learn and evaluate each rule independently and are thus less susceptible to changes in class distributions. Most importantly, we present a new windowing algorithm that achieves additional gains in efficiency by saving promising rules and removing the examples they cover from the learning window. While the presented algorithm is only suitable for redundant, noise-free data sets, we also briefly discuss the problem that noisy data poses for windowing algorithms.
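The core loop described above — learn rules on a small window, save rules that prove consistent on the full data, remove the examples they cover from the window, and add misclassified examples — can be sketched as follows. This is a minimal, hypothetical Python illustration under stated assumptions: the toy separate-and-conquer learner, the single-attribute rule representation, the data set, and all function names are illustrative stand-ins, not the paper's actual implementation.

```python
import random

def learn_rules(window):
    """Toy separate-and-conquer learner (an illustrative stand-in):
    greedily picks single-attribute tests that cover examples of only
    one class in the current window."""
    rules, remaining = [], list(window)
    while remaining:
        best = None
        for attr in range(len(remaining[0][0])):
            for val in {x[attr] for x, _ in remaining}:
                covered = [(x, y) for x, y in remaining if x[attr] == val]
                if len({y for _, y in covered}) == 1:   # "pure" rule
                    if best is None or len(covered) > len(best[2]):
                        best = (attr, val, covered)
        if best is None:          # no pure rule left; stop
            break
        attr, val, covered = best
        rules.append((attr, val, covered[0][1]))
        remaining = [(x, y) for x, y in remaining if x[attr] != val]
    return rules

def classify(rules, x):
    """Return the class of the first rule covering x, or None."""
    for attr, val, cls in rules:
        if x[attr] == val:
            return cls
    return None

def integrative_windowing(data, init_size=4, max_add=4, seed=0):
    rng = random.Random(seed)
    window = rng.sample(data, min(init_size, len(data)))
    saved = []                    # rules found consistent on ALL data
    while True:
        rules = learn_rules(window)
        model = saved + rules
        errors = [(x, y) for x, y in data if classify(model, x) != y]
        if not errors:
            return model
        # Integrative step: save rules that make no mistake on the
        # complete data set, and drop the window examples they cover.
        for r in rules:
            if all(classify([r], x) in (None, y) for x, y in data):
                saved.append(r)
        window = [(x, y) for x, y in window if classify(saved, x) is None]
        # Grow the window with a sample of the misclassified examples.
        window += [e for e in rng.sample(errors, min(max_add, len(errors)))
                   if e not in window]

# Toy, redundant, noise-free data set: the first attribute alone
# determines the class, so consistent rules exist.
data = [(("red", "circle"), "pos"), (("red", "square"), "pos"),
        (("green", "circle"), "pos"), (("green", "square"), "pos"),
        (("blue", "circle"), "neg"), (("blue", "square"), "neg"),
        (("yellow", "circle"), "neg"), (("yellow", "square"), "neg")]
model = integrative_windowing(data)
```

Because rules are evaluated independently, a rule that is already consistent on the complete data need never be re-learned; removing the examples it covers is what keeps the window (and hence the learning effort) small. On noisy data this consistency test would save almost nothing, which is why the sketch, like the algorithm it illustrates, assumes noise-free data.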