We show that, given data from a mixture of k well-separated spherical Gaussians in R^n, a simple two-round variant of EM will, with high probability, learn the centers of the Gaussians to near-optimal precision, provided the dimension is high (n ≫ log k). We relate this to previous theoretical and empirical work on the EM algorithm.
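The idea of running only two E/M rounds on spherical Gaussians can be sketched as follows. This is a hypothetical illustration, not the paper's exact procedure (the actual variant involves overseeding and pruning of candidate centers); the function name `two_round_em`, the farthest-first initialization, and the assumption of a known common variance `sigma2` are all choices made for this sketch.

```python
import numpy as np

def two_round_em(X, k, sigma2, rng):
    """Run two E/M rounds of EM for a mixture of k spherical Gaussians.

    Sketch under simplifying assumptions: the common variance sigma2 is
    known, and centers are initialized by a greedy farthest-first pass so
    that each well-separated cluster receives one starting center.
    """
    n_samples, _ = X.shape
    # Farthest-first initialization: start from a random point, then
    # repeatedly add the point farthest from all chosen centers.
    centers = [X[rng.integers(n_samples)]]
    for _ in range(k - 1):
        d2 = np.min(
            ((X[:, None, :] - np.asarray(centers)[None, :, :]) ** 2).sum(-1),
            axis=1,
        )
        centers.append(X[np.argmax(d2)])
    centers = np.asarray(centers)

    for _ in range(2):  # the two rounds
        # E-step: soft responsibilities under equal-weight spherical
        # Gaussians; subtract the row max for numerical stability.
        d2 = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(-1)
        log_r = -d2 / (2.0 * sigma2)
        log_r -= log_r.max(axis=1, keepdims=True)
        r = np.exp(log_r)
        r /= r.sum(axis=1, keepdims=True)
        # M-step: responsibility-weighted mean update for each center.
        centers = (r.T @ X) / r.sum(axis=0)[:, None]
    return centers
```

On data drawn from two unit-variance Gaussians in R^50 with centers 20 apart, two such rounds already place each estimated center within sampling error of a true center, in line with the high-dimensional regime the abstract describes.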