Most ILP systems employ the covering algorithm, whereby a hypothesis is constructed iteratively, clause by clause. The covering algorithm is typically greedy: each iteration adds the best clause according to some local evaluation criterion. Typical problems of the covering algorithm include unnecessarily long hypotheses, difficulty in handling recursion, and difficulty in learning multiple predicates. This paper investigates a non-covering approach to ILP, implemented as a Prolog program called HYPER, whose design goals were: to use intensional background knowledge, to handle recursion well, and to enable multi-predicate learning. The experimental results in this paper may appear surprising in view of the very high combinatorial complexity of the search space associated with the non-covering approach.
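The greedy covering loop criticized above can be sketched as follows. This is a minimal, hypothetical illustration, not code from HYPER or any particular ILP system: clauses are modeled as predicates over examples, and the "local evaluation criterion" is simply the number of still-uncovered positive examples a clause covers.

```python
# Hypothetical sketch of the greedy covering algorithm (not HYPER's code).
# A "clause" here is any predicate mapping an example to True/False.

def learn_by_covering(positives, negatives, candidate_clauses):
    """Greedily add the clause covering the most remaining positives
    while covering no negatives, until all positives are explained."""
    hypothesis = []
    uncovered = set(positives)
    while uncovered:
        # Keep only clauses consistent with the negative examples,
        # then pick the one covering the most uncovered positives.
        best = max(
            (c for c in candidate_clauses
             if not any(c(e) for e in negatives)),
            key=lambda c: sum(1 for e in uncovered if c(e)),
            default=None,
        )
        if best is None or not any(best(e) for e in uncovered):
            break  # no admissible clause makes further progress
        hypothesis.append(best)
        uncovered -= {e for e in uncovered if best(e)}
    return hypothesis

# Toy task: learn to cover the even numbers without covering the odd ones.
pos = [2, 4, 6, 8]
neg = [1, 3, 5]
clauses = [lambda x: x % 2 == 0, lambda x: x > 5, lambda x: x < 3]
hypothesis = learn_by_covering(pos, neg, clauses)
```

Because each clause is chosen by a local criterion and never revisited, the loop can commit to suboptimal clauses early, which is one source of the unnecessarily long hypotheses and recursion difficulties noted above; a non-covering learner such as HYPER instead searches over whole hypotheses at once.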