Learning decision trees from random examples
Information and Computation
Elements of information theory
Wiley-Interscience
Rank-r decision trees are a subclass of r-decision lists
Information Processing Letters
Weakly learning DNF and characterizing statistical query learning using Fourier analysis
STOC '94 Proceedings of the twenty-sixth annual ACM symposium on Theory of computing
On the learnability of discrete distributions
STOC '94 Proceedings of the twenty-sixth annual ACM symposium on Theory of computing
Efficient noise-tolerant learning from statistical queries
Journal of the ACM (JACM)
Estimating a mixture of two product distributions
COLT '99 Proceedings of the twelfth annual conference on Computational learning theory
Learning mixtures of arbitrary Gaussians
STOC '01 Proceedings of the thirty-third annual ACM symposium on Theory of computing
Evolutionary Trees Can be Learned in Polynomial Time in the Two-State General Markov Model
SIAM Journal on Computing
A Spectral Algorithm for Learning Mixtures of Distributions
FOCS '02 Proceedings of the 43rd Symposium on Foundations of Computer Science
A Two-Round Variant of EM for Gaussian Mixtures
UAI '00 Proceedings of the 16th Conference on Uncertainty in Artificial Intelligence
Learning Mixtures of Gaussians
FOCS '99 Proceedings of the 40th Annual Symposium on Foundations of Computer Science
Multilinear formulas and skepticism of quantum computing
STOC '04 Proceedings of the thirty-sixth annual ACM symposium on Theory of computing
Learning nonsingular phylogenies and hidden Markov models
Proceedings of the thirty-seventh annual ACM symposium on Theory of computing
Toward privacy in public databases
TCC'05 Proceedings of the Second international conference on Theory of Cryptography
Efficient learning of Naive Bayes classifiers under class-conditional classification noise
ICML '06 Proceedings of the 23rd international conference on Machine learning
Application of a generalization of Russo's formula to learning from multiple random oracles
Combinatorics, Probability and Computing
Separating populations with wide data: a spectral analysis
ISAAC'07 Proceedings of the 18th international conference on Algorithms and computation
PAC learning axis-aligned mixtures of Gaussians with no separation assumption
COLT'06 Proceedings of the 19th annual conference on Learning Theory
We consider the problem of learning mixtures of product distributions over discrete domains in the distribution learning framework introduced by Kearns et al. [19]. We give a poly(n/ε)-time algorithm for learning a mixture of k arbitrary product distributions over the n-dimensional Boolean cube {0,1}^n to accuracy ε, for any constant k. Previous poly(n)-time algorithms could achieve this only for k = 2 product distributions; our result answers an open question stated independently in [8] and [15]. We further give evidence that no polynomial-time algorithm can succeed when k is superconstant, by reduction from a notorious open problem in PAC learning. Finally, we generalize our poly(n/ε)-time algorithm to learn any mixture of k = O(1) product distributions over {0, 1, . . . , b}^n, for any b = O(1).