Principal Component Analysis (PCA) is one of the most widely used algorithmic techniques. When is PCA provably effective? What are its main limitations and how can we get around them? In this note, we discuss three specific challenges.
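As a concrete baseline for the discussion, here is a minimal sketch of PCA via the singular value decomposition: center the data, take the top-k right singular vectors of the centered matrix, and project onto their span. The function name `pca` and the synthetic example are illustrative, not from the note.

```python
import numpy as np

def pca(X, k):
    """Project the rows of X onto the top-k principal components.

    A minimal sketch: the top-k right singular vectors of the
    centered data matrix span the best-fit k-dimensional subspace.
    """
    Xc = X - X.mean(axis=0)          # center each coordinate
    # Right singular vectors are eigenvectors of the sample covariance.
    _, _, Vt = np.linalg.svd(Xc, full_matrices=False)
    components = Vt[:k]              # top-k principal directions (rows)
    return Xc @ components.T, components

# Illustrative example: points near a 1-dimensional subspace of R^3.
rng = np.random.default_rng(0)
t = rng.normal(size=(200, 1))
X = t @ np.array([[3.0, 1.0, 0.0]]) + 0.01 * rng.normal(size=(200, 3))
proj, comps = pca(X, 1)
```

On such near-degenerate data the top principal direction recovers the underlying line almost exactly; the challenges discussed below concern settings where this clean picture breaks down.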