A guided tour of Chernoff bounds
Information Processing Letters
Expected Length of the Longest Probe Sequence in Hash Code Searching
Journal of the ACM (JACM)
Complexity of Quantifier Elimination in the Theory of Algebraically Closed Fields
Proceedings of the Mathematical Foundations of Computer Science 1984
Learning with matrix factorizations
Fast maximum margin matrix factorization for collaborative prediction
ICML '05 Proceedings of the 22nd international conference on Machine learning
Unsupervised Learning of Image Manifolds by Semidefinite Programming
International Journal of Computer Vision
Theory of semidefinite programming for Sensor Network Localization
Mathematical Programming: Series A and B
Convex multi-task feature learning
Machine Learning
Exact Matrix Completion via Convex Optimization
Foundations of Computational Mathematics
Joint covariate selection and joint subspace selection for multiple classification problems
Statistics and Computing
The power of convex relaxation: near-optimal matrix completion
IEEE Transactions on Information Theory
Matrix completion from a few entries
IEEE Transactions on Information Theory
Strong converse for identification via quantum channels
IEEE Transactions on Information Theory
On sparse representations in arbitrary redundant bases
IEEE Transactions on Information Theory
Recovering Low-Rank Matrices From Few Coefficients in Any Basis
IEEE Transactions on Information Theory
Exact matrix completion via convex optimization
Communications of the ACM
Low-rank Matrix Recovery via Iteratively Reweighted Least Squares Minimization
SIAM Journal on Optimization
Restricted strong convexity and weighted matrix completion: optimal bounds with noise
The Journal of Machine Learning Research
A matrix hyperbolic cosine algorithm and applications
ICALP'12 Proceedings of the 39th International Colloquium on Automata, Languages, and Programming - Volume Part I
Ranking and sparsifying a connection graph
WAW'12 Proceedings of the 9th international conference on Algorithms and Models for the Web Graph
Loose Laplacian spectra of random hypergraphs
Random Structures & Algorithms
Accelerated Linearized Bregman Method
Journal of Scientific Computing
This paper provides the best bounds to date on the number of randomly sampled entries required to reconstruct an unknown low-rank matrix. These results improve on prior work by Candès and Recht (2009), Candès and Tao (2009), and Keshavan et al. (2009). The reconstruction is accomplished by minimizing the nuclear norm, i.e., the sum of the singular values, of the hidden matrix subject to agreement with the provided entries. If the underlying matrix satisfies a certain incoherence condition, then the number of entries required equals the number of parameters in the singular value decomposition times a factor that is quadratic in a logarithm of the dimension. The proof of this assertion is short, self-contained, and uses only elementary analysis. The novel techniques herein are based on recent work in quantum information theory.
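The paper itself gives no code, but the two quantities the abstract mentions are easy to illustrate. As a hedged NumPy sketch (the function names `nuclear_norm` and `svd_shrink` are illustrative, not from the paper): the nuclear norm is just the sum of singular values, and its proximal operator, singular-value soft-thresholding, is the basic building block of iterative nuclear-norm solvers such as SVT.

```python
import numpy as np

def nuclear_norm(A):
    """Sum of singular values: the convex surrogate for rank used in the paper."""
    return np.linalg.svd(A, compute_uv=False).sum()

def svd_shrink(A, tau):
    """Singular-value soft-thresholding: the proximal operator of
    tau * nuclear norm, used by iterative solvers for this problem."""
    U, s, Vt = np.linalg.svd(A, full_matrices=False)
    return U @ np.diag(np.maximum(s - tau, 0.0)) @ Vt

# Rank-1 sanity check: the nuclear norm of u v^T is ||u|| * ||v||.
u = np.array([3.0, 4.0])        # norm 5
v = np.array([1.0, 0.0, 0.0])   # norm 1
print(nuclear_norm(np.outer(u, v)))          # ~ 5.0

# Shrinking diag(3, 1) with tau = 2 zeroes out the small singular value.
print(svd_shrink(np.diag([3.0, 1.0]), 2.0))  # ~ [[1, 0], [0, 0]]
```

In a full completion solver, `svd_shrink` is applied repeatedly while enforcing agreement with the observed entries; the theory summarized above bounds how many such entries suffice for exact recovery.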