Knowledge compilation speeds inference by creating tractable approximations of a knowledge base, but this advantage is lost if the approximations are too large. We show how learning concept generalizations can allow for a more compact representation of the tractable theory. We also give a general induction rule for generating such concept generalizations. Finally, we prove that unless NP ⊆ non-uniform P, not all theories have small Horn least-upper-bound approximations.
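As a concrete illustration of the Horn least-upper-bound approximations mentioned above, the sketch below uses the standard characterization that a propositional theory is Horn-equivalent exactly when its model set is closed under componentwise intersection; the models of the Horn LUB of a theory are therefore the intersection closure of the theory's models. This is a brute-force sketch for small variable counts, not the paper's method, and the function names (`models`, `intersection_closure`, `lub_entails`) are invented for this example.

```python
from itertools import product

def models(clauses, n):
    """All assignments (0/1 tuples over n variables) satisfying the CNF.
    A clause is a list of nonzero ints: +i means variable i true, -i false."""
    return [m for m in product([0, 1], repeat=n)
            if all(any((lit > 0) == bool(m[abs(lit) - 1]) for lit in c)
                   for c in clauses)]

def intersection_closure(ms):
    """Close a model set under componentwise AND. The result is exactly the
    model set of the Horn least upper bound (LUB) of the original theory."""
    closed = set(ms)
    changed = True
    while changed:
        changed = False
        for a in list(closed):
            for b in list(closed):
                c = tuple(x & y for x, y in zip(a, b))
                if c not in closed:
                    closed.add(c)
                    changed = True
    return closed

def lub_entails(clauses, n, query_clause):
    """Sound approximate entailment test: if the Horn LUB entails the
    query clause, then so does the original theory (the converse can fail)."""
    lub_models = intersection_closure(models(clauses, n))
    return all(any((lit > 0) == bool(m[abs(lit) - 1]) for lit in query_clause)
               for m in lub_models)

# Example theory over a=1, b=2, c=3: (a or b) and (not a or c).
# The LUB recovers the Horn consequence (not a or c) but, as expected,
# loses the non-Horn clause (a or b).
print(lub_entails([[1, 2], [-1, 3]], 3, [-1, 3]))  # True
print(lub_entails([[1, 2], [-1, 3]], 3, [1, 2]))   # False
```

The example also shows why LUB approximations trade completeness for tractability: queries the LUB answers "yes" to are guaranteed consequences of the original theory, while non-Horn consequences such as `a or b` fall outside what any Horn upper bound can capture. The hardness result quoted in the abstract says that, in the worst case, no small Horn LUB exists at all unless NP ⊆ non-uniform P.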