We study what kinds of data can ease the computational complexity of learning Horn clause theories (in Gold's paradigm) and Boolean functions (in the PAC-learning paradigm). We give several definitions of good data (basic and generative representative sets) and develop data-driven algorithms that learn faster from good examples and degenerate to learning in the limit from the worst possible examples. We show that Horn clause theories, k-term DNF, and general DNF Boolean functions are polynomially learnable from generative representative presentations.
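The idea that well-chosen examples can speed up learning can be illustrated with a toy sketch (this is an illustrative standard construction, not the paper's actual algorithm): learning a monotone conjunction (monomial) over Boolean variables by intersecting positive examples. A single "good" positive example, one that sets exactly the target's variables to 1, determines the hypothesis immediately, whereas arbitrary examples may each eliminate only a few variables.

```python
def learn_monomial(positive_examples):
    """Learn a monotone conjunction (monomial) over n Boolean variables
    from positive examples: a variable stays in the hypothesis only if
    it is 1 in every positive example seen so far."""
    n = len(positive_examples[0])
    hypothesis = set(range(n))  # start with the conjunction of all variables
    for example in positive_examples:
        hypothesis = {i for i in hypothesis if example[i] == 1}
    return sorted(hypothesis)

# If the target is x0 AND x2, the single example 1010 pins it down at once:
print(learn_monomial([(1, 0, 1, 0)]))              # -> [0, 2]
# Generic examples may each rule out only some variables:
print(learn_monomial([(1, 1, 1, 0), (1, 0, 1, 1)]))  # -> [0, 2]
```

Here the one-example presentation plays the role of a maximally informative ("representative") data set, while the algorithm still converges in the limit from arbitrary positive data.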