We present an algorithm for learning a mixture of distributions based on spectral projection. We prove a general property of spectral projection for arbitrary mixtures and show that the resulting algorithm is efficient when the components of the mixture are logconcave distributions in $\Re^n$ whose means are separated. The required separation grows with $k$, the number of components, and with $\log n$. This is the first result demonstrating the benefit of spectral projection for general Gaussians, and it widens the scope of the method. It improves substantially on previous results, which either focus on the special case of spherical Gaussians or require a separation with a considerably larger dependence on $n$.
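The core idea can be illustrated with a minimal sketch: project the sample matrix onto the span of its top-$k$ singular vectors, where the separated component means become easy to distinguish. The setup below (two well-separated spherical Gaussians, specific dimensions and separation) is an illustrative assumption, not the paper's actual construction or algorithm.

```python
import numpy as np

rng = np.random.default_rng(0)
n, m, k = 50, 400, 2  # ambient dimension, samples per component, components

# Illustrative mixture: two spherical unit-variance Gaussians in R^n
# whose means are separated along the first coordinate.
mu = np.zeros((k, n))
mu[0, 0], mu[1, 0] = 6.0, -6.0
X = np.vstack([rng.normal(mu[i], 1.0, size=(m, n)) for i in range(k)])
labels = np.repeat(np.arange(k), m)

# Spectral projection: map the (centered) samples onto the subspace
# spanned by the top-k right singular vectors of the sample matrix.
Xc = X - X.mean(axis=0)
_, _, Vt = np.linalg.svd(Xc, full_matrices=False)
P = Xc @ Vt[:k].T  # coordinates in the top-k spectral subspace

# In the projected space the components are far apart relative to their
# spread; for k = 2 a sign cut along the leading direction suffices.
pred = (P[:, 0] > 0).astype(int)
acc = max(np.mean(pred == labels), np.mean(pred != labels))
print(f"clustering accuracy after projection: {acc:.3f}")
```

The point of the projection is dimension reduction that preserves the separation between component means while averaging away much of the within-component noise, so a simple distance-based clustering succeeds in $k$ dimensions instead of $n$.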