We present novel techniques for analyzing the problem of low-rank matrix recovery. The methods are both considerably simpler and more general than previous approaches. It is shown that an unknown n × n matrix of rank r can be efficiently reconstructed from only O(n r ν log² n) randomly sampled expansion coefficients with respect to any given matrix basis. The number ν quantifies the "degree of incoherence" between the unknown matrix and the basis. Existing work concentrated mostly on the problem of "matrix completion," where one aims to recover a low-rank matrix from randomly selected matrix elements; our result covers this situation as a special case. The proof consists of a series of relatively elementary steps, which stands in contrast to the highly involved methods previously employed to obtain comparable results. In cases where bounds had been known before, our estimates are slightly tighter. We also discuss operator bases which are incoherent to all low-rank matrices simultaneously. For these bases, we show that O(n r log n) randomly sampled expansion coefficients suffice to recover any rank-r matrix with high probability. The latter bound is tight up to multiplicative constants.
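To make the matrix-completion special case concrete, the following is a minimal sketch of how a low-rank matrix can in practice be recovered from a random subset of its entries via iterative singular value soft-thresholding (a proximal-gradient heuristic for nuclear norm minimization, in the spirit of SoftImpute). It is an illustration of the problem setting only, not the analysis technique of the paper; the function name, threshold tau, and iteration count are illustrative choices.

```python
import numpy as np

def soft_impute(M_obs, mask, tau=0.1, iters=2000):
    """Complete a partially observed matrix by iterated singular value
    soft-thresholding (an illustrative nuclear-norm heuristic).

    M_obs : array with observed entries (zeros elsewhere)
    mask  : boolean array marking which entries were observed
    """
    X = np.zeros_like(M_obs)
    for _ in range(iters):
        # Keep observed entries fixed; fill the rest with the estimate.
        Y = np.where(mask, M_obs, X)
        # Proximal step for tau * ||X||_*: shrink the singular values.
        U, s, Vt = np.linalg.svd(Y, full_matrices=False)
        X = (U * np.maximum(s - tau, 0.0)) @ Vt
    return X

# Usage: recover a random rank-2 matrix from roughly 70% of its entries.
rng = np.random.default_rng(0)
M = rng.standard_normal((20, 2)) @ rng.standard_normal((2, 20))
mask = rng.random((20, 20)) < 0.7
X = soft_impute(np.where(mask, M, 0.0), mask)
rel_err = np.linalg.norm(X - M) / np.linalg.norm(M)
```

With enough observed entries relative to the rank, the reconstruction error is small; with too few samples (fewer than the degrees of freedom of a rank-r matrix), no method can succeed, which is what the sampling bounds above quantify.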