In this paper, we study the possibility of Occam's razors for a widely studied class of Boolean formulae: Disjunctive Normal Forms (DNF). An Occam's razor is an algorithm that compresses the knowledge contained in observations (examples) into a small formula. We prove that approximating the minimal consistent DNF formula, as well as a generalization of graph colorability, is very hard. Our proof technique has the property that the stronger the complexity-theoretic hypothesis used, the larger the inapproximability ratio obtained. Our ratio is among the first to integrate the three parameters of Occam's razors: the number of examples, the number of description attributes, and the size of the target formula labelling the examples. From a theoretical standpoint, our result rules out the existence of efficient deterministic Occam's razor algorithms for DNF. From a practical standpoint, it establishes a large worst-case lower bound on the size of the formulae found by learning systems that proceed by rule searching.
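To make the object of study concrete, the following is a minimal, hypothetical sketch (not the paper's construction) of the kind of rule-searching learner the lower bound applies to: a greedy covering heuristic that builds a DNF consistent with labeled Boolean examples by generalizing one positive example at a time. The inapproximability result says that no efficient deterministic procedure of this kind can guarantee a formula size close to the minimum in the worst case; accordingly, this sketch offers no size guarantee.

```python
def greedy_consistent_dnf(pos, neg, n):
    """Greedily build a DNF (list of terms) over n Boolean attributes
    that covers every example in `pos` and none in `neg`.
    A term is a dict {attribute_index: required_bit}; an empty dict
    is the always-true term. Function and variable names are
    illustrative, not from the paper."""
    def satisfies(term, x):
        return all(x[i] == b for i, b in term.items())

    uncovered = list(pos)
    dnf = []
    while uncovered:
        # Start from the full conjunction describing one positive example...
        seed = uncovered[0]
        term = {i: seed[i] for i in range(n)}
        # ...then drop literals while the term still rejects all negatives.
        for i in range(n):
            reduced = {j: b for j, b in term.items() if j != i}
            if not any(satisfies(reduced, x) for x in neg):
                term = reduced
        dnf.append(term)
        uncovered = [x for x in uncovered if not satisfies(term, x)]
    return dnf
```

On the target x0 OR (x1 AND x2), for example, feeding in all eight labeled assignments recovers a two-term DNF; on adversarial inputs, however, the greedy choice can output far more terms than the minimum, which is exactly the regime the hardness result concerns.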