Cross-validation is a technique used in many machine learning approaches. A straightforward implementation incurs computational overhead, but much of this overhead has been shown to consist of redundant computations, which can be avoided by performing all folds of the cross-validation in parallel. In this paper we study to what extent such a parallel algorithm is also useful in ILP. We discuss two issues: (a) dependencies between parts of a query that limit the obtainable efficiency improvements, and (b) the combination of parallel cross-validation with query packs. Tentative solutions are proposed and evaluated experimentally.
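To illustrate the redundancy the abstract refers to, the following is a minimal sketch (not the paper's algorithm) of computing a per-fold training-set statistic for k-fold cross-validation in a single pass over the data rather than one pass per fold. The function and variable names (`make_folds`, `per_fold_training_counts`) are illustrative assumptions, not taken from the paper.

```python
# Sketch: a naive k-fold cross-validation would scan the data k times,
# once per fold, to compute a training-set statistic. Because each
# example appears in k-1 of the k training sets, the k scans are largely
# redundant: one pass plus a subtraction suffices.

def make_folds(n, k):
    """Assign each of n examples to one of k folds (round-robin)."""
    return [i % k for i in range(n)]

def per_fold_training_counts(labels, fold_of, k):
    """For each fold f, count positive labels in its TRAINING set
    (all examples NOT held out in fold f), using a single pass."""
    total = 0
    in_fold = [0] * k
    for label, f in zip(labels, fold_of):
        if label:
            total += 1
            in_fold[f] += 1
    # Training count for fold f = all positives minus those held out in f.
    return [total - in_fold[f] for f in range(k)]

labels = [1, 0, 1, 1, 0, 1, 0, 1]
k = 4
fold_of = make_folds(len(labels), k)
print(per_fold_training_counts(labels, fold_of, k))  # → [4, 4, 4, 3]
```

The same "compute once, reuse across all folds" idea underlies the parallel cross-validation the abstract describes; the complication studied in the paper is that in ILP the per-example computation (query evaluation) has internal dependencies, so it does not decompose this cleanly.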