Evaluations of the advantages of Probabilistic Inductive Logic Programming (PILP) over ILP have not previously been conducted from a computational learning theory point of view. We propose a PILP framework, projection-based PILP, in which surjective projection functions are used to produce a "lossy" compression of an ILP dataset. We present sample complexity results, including conditions under which projection-based PILP needs fewer examples than standard PAC learning. We experimentally confirm the theoretical bounds for projection-based PILP in the Blackjack domain using Cellist, a system which machine-learns Probabilistic Logic Automata. In our experiments, projection-based PILP shows lower predictive error than the theoretical bounds predict and achieves substantially lower predictive error than ILP. To the authors' knowledge, this is the first paper describing both a computational learning theory and related empirical results on an advantage of PILP over ILP.
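The abstract does not give the concrete form of the projection functions, so the following is only a minimal sketch of one plausible reading: a surjective projection merges boolean-labelled ILP examples that share an image, and the fraction of positive labels among the merged preimages becomes the probabilistic label of the compressed example. The names `project_dataset` and `proj`, and the toy Blackjack hands, are illustrative assumptions, not the paper's actual implementation.

```python
from collections import defaultdict

def project_dataset(examples, proj):
    """Compress a boolean-labelled ILP dataset into a probabilistic one.

    `examples` is an iterable of (instance, label) pairs with label in {0, 1};
    `proj` is a surjective projection function onto a smaller codomain.
    Instances sharing an image under `proj` are merged ("lossy" compression),
    and the fraction of positive labels among them becomes the
    probabilistic label of the projected example.
    """
    pos = defaultdict(int)    # positive-label counts per projected example
    total = defaultdict(int)  # preimage sizes per projected example
    for x, y in examples:
        z = proj(x)
        pos[z] += y
        total[z] += 1
    return {z: pos[z] / total[z] for z in total}

# Toy Blackjack-style illustration: project a hand (a tuple of card
# values) onto its point total, discarding the individual card identities.
hands = [((10, 9), 1), ((10, 9), 1), ((5, 6, 8), 0), ((10, 10), 1), ((9, 5, 5), 0)]
print(project_dataset(hands, proj=sum))
# {19: 0.5, 20: 1.0}
```

Under this reading, the sample-complexity gain would come from learning over the smaller projected instance space rather than the original one, at the cost of the information discarded by the projection.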