In this paper we extend the Monotone Theory to the PAC-learning model with membership queries. Using this extension, we show that a DNF formula f that has at least one "1/poly-heavy" clause in one of its CNF representations (a clause that is not satisfied with probability 1/poly(n, s), where n is the number of variables and s is the number of terms in f) with respect to a distribution D is weakly learnable under this distribution. Consequently, DNF formulas that are not weakly learnable under the distribution D have no "1/poly-heavy" clauses in any of their CNF representations.

A DNF f is called τ-CDNF if there is τ' ≤ τ and a CNF representation of f containing poly(n, s) clauses that τ'-approximates f with respect to a distribution D. We show that the class of all τ-CDNF formulas is weakly (τ + ε)-PAC-learnable with membership queries under the distribution D.

We then show how to convert our algorithm into a parallel algorithm that runs in polylogarithmic time with a polynomial number of processors. In particular, decision trees are (strongly) PAC-learnable with membership queries under any distribution in parallel in polylogarithmic time with a polynomial number of processors. Finally, we show that no efficient parallel exact learning algorithm exists for decision trees.
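As a rough illustration (not taken from the paper), the "heaviness" of a clause under a distribution D can be estimated by sampling: draw examples from D and count how often the clause is falsified. The sketch below assumes the uniform distribution and a hypothetical literal encoding; a clause with k literals is falsified with probability 2^(-k) under uniform D, so short clauses are 1/poly-heavy.

```python
import random

def clause_satisfied(clause, x):
    # clause: list of (variable index, required bit) literals, OR semantics;
    # satisfied if any literal matches the assignment x
    return any(x[i] == b for i, b in clause)

def estimate_falsify_prob(clause, n, samples=100_000):
    # Monte Carlo estimate of Pr_D[clause falsified] under the
    # uniform distribution D on {0,1}^n (illustrative choice of D)
    misses = sum(
        not clause_satisfied(clause, [random.randint(0, 1) for _ in range(n)])
        for _ in range(samples)
    )
    return misses / samples

# A 2-literal clause over 10 variables is falsified with probability 1/4
# under uniform D, so it counts as "1/poly-heavy" in the abstract's sense.
```

The function names and encoding here are invented for illustration; the paper's learning algorithm does not literally sample clause falsification rates, but this is the quantity the "1/poly-heavy" condition bounds.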