Decision theoretic generalizations of the PAC model for neural net and other learning applications. Information and Computation.
An introduction to computational learning theory.
Neural Networks for Pattern Recognition.
Learning mixtures of Gaussians. FOCS '99: Proceedings of the 40th Annual Symposium on Foundations of Computer Science.
A spectral algorithm for learning mixture models. Journal of Computer and System Sciences, special issue on FOCS 2002.
On convergence properties of the EM algorithm for Gaussian mixtures. Neural Computation.
An information-theoretic analysis of hard and soft assignment methods for clustering. UAI '97: Proceedings of the Thirteenth Conference on Uncertainty in Artificial Intelligence.
The spectral method for general mixture models. COLT '05: Proceedings of the 18th Annual Conference on Learning Theory.
On spectral learning of mixtures of distributions. COLT '05: Proceedings of the 18th Annual Conference on Learning Theory.
Clustering stability: an overview. Foundations and Trends® in Machine Learning.
An outlier-aware data clustering algorithm in mixture models. ICICS '09: Proceedings of the 7th International Conference on Information, Communications and Signal Processing.
A spectral algorithm for learning hidden Markov models. Journal of Computer and System Sciences.
Learning mixtures of spherical Gaussians: moment methods and spectral decompositions. Proceedings of the 4th Conference on Innovations in Theoretical Computer Science.
Learning mixtures of arbitrary distributions over large discrete domains. Proceedings of the 5th Conference on Innovations in Theoretical Computer Science.
We show that, given data from a mixture of k well-separated spherical Gaussians in ℝ^d, a simple two-round variant of EM will, with high probability, learn the parameters of the Gaussians to near-optimal precision, provided the dimension is high (d ≫ ln k). We relate this to previous theoretical and empirical work on the EM algorithm.
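To make the setting concrete, here is a minimal sketch, not the paper's exact two-round variant: plain EM for a uniform-weight mixture of two spherical Gaussians, run for two rounds after a farthest-point initialization. All parameter choices (k = 2, d = 20, separation, sample size) are illustrative assumptions, not values from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic data: k = 2 well-separated spherical (unit-variance) Gaussians in d dims.
d, n = 20, 2000
true_means = np.stack([np.zeros(d), np.full(d, 10.0 / np.sqrt(d))])  # separation 10
labels = rng.integers(0, 2, size=n)
X = true_means[labels] + rng.standard_normal((n, d))

def em_round(X, means, var):
    """One EM iteration for a uniform-weight spherical Gaussian mixture."""
    # E-step: responsibilities from squared distances under a shared spherical variance.
    d2 = ((X[:, None, :] - means[None, :, :]) ** 2).sum(-1)      # (n, k)
    log_r = -d2 / (2 * var)
    r = np.exp(log_r - log_r.max(axis=1, keepdims=True))
    r /= r.sum(axis=1, keepdims=True)
    # M-step: responsibility-weighted means, then a single shared variance estimate.
    means = (r[:, :, None] * X[:, None, :]).sum(0) / r.sum(0)[:, None]
    d2 = ((X[:, None, :] - means[None, :, :]) ** 2).sum(-1)
    var = (r * d2).sum() / (X.shape[0] * X.shape[1])
    return means, var

# Farthest-point initialization, then exactly two EM rounds.
m0 = X[0]
m1 = X[np.argmax(((X - m0) ** 2).sum(axis=1))]
means, var = np.stack([m0, m1]), X.var()
for _ in range(2):
    means, var = em_round(X, means, var)
```

With this much separation, the responsibilities are nearly hard assignments after the first round, so two rounds already place the estimated means close to the true cluster means (up to permutation of the components).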