Exact matrix completion via convex optimization
Communications of the ACM
This paper is concerned with the problem of recovering an unknown matrix from a small fraction of its entries. This is known as the matrix completion problem, and it arises in a great number of applications, including the famous Netflix Prize and similar questions in collaborative filtering. In general, accurate recovery of a matrix from a small number of entries is impossible, but the knowledge that the unknown matrix has low rank radically changes the premise, making the search for solutions meaningful. This paper presents optimality results quantifying the minimum number of entries needed to recover a matrix of rank r exactly by any method whatsoever (the information-theoretic limit). More importantly, the paper shows that, under certain incoherence assumptions on the singular vectors of the matrix, recovery is possible by solving a convenient convex program as soon as the number of entries is on the order of the information-theoretic limit (up to logarithmic factors). This convex program simply finds, among all matrices consistent with the observed entries, the one with minimum nuclear norm. As an example, we show that on the order of nr log(n) samples are needed to recover a random n × n matrix of rank r by any method, and, to be sure, nuclear norm minimization succeeds as soon as the number of entries is of the form nr polylog(n).
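The convex program described above seeks, among matrices agreeing with the observed entries, the one of minimum nuclear norm. A minimal NumPy sketch of one standard way to solve a regularized variant of this program is iterative singular value soft-thresholding (soft-impute style); this is an illustrative solver, not the paper's own algorithm, and the regularization weight `lam`, iteration count, and demo sizes below are arbitrary choices for the example.

```python
import numpy as np

def soft_impute(M_obs, mask, lam=0.5, iters=500):
    """Nuclear-norm-regularized matrix completion via iterative singular
    value soft-thresholding. `lam` and `iters` are illustrative values."""
    X = np.zeros_like(M_obs)
    for _ in range(iters):
        # Fill the unobserved entries with the current estimate.
        Y = mask * M_obs + (1 - mask) * X
        # Soft-threshold the singular values: the proximal map of the nuclear norm.
        U, s, Vt = np.linalg.svd(Y, full_matrices=False)
        X = (U * np.maximum(s - lam, 0.0)) @ Vt
    return X

# Demo: recover a random rank-1 20 x 20 matrix from roughly 60% of its entries.
rng = np.random.default_rng(0)
n, r = 20, 1
M = rng.standard_normal((n, r)) @ rng.standard_normal((r, n))
mask = (rng.random((n, n)) < 0.6).astype(float)
X = soft_impute(mask * M, mask)
rel_err = np.linalg.norm(X - M) / np.linalg.norm(M)
```

Because the matrix is low rank and its singular vectors are spread out (incoherent), the recovered `X` agrees closely with `M` even on the entries that were never observed, which is exactly the phenomenon the paper's theorems quantify.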