We consider a problem of considerable practical interest: the recovery of a data matrix from a sampling of its entries. Suppose that we observe m entries selected uniformly at random from a matrix M. Can we complete the matrix and recover the entries that we have not seen? We show that one can perfectly recover most low-rank matrices from what appears to be an incomplete set of entries. We prove that if the number m of sampled entries obeys $m \ge C\,n^{1.2}\,r\log n$ for some positive numerical constant C, then with very high probability, most n×n matrices of rank r can be perfectly recovered by solving a simple convex optimization program. This program finds the matrix with minimum nuclear norm that fits the data. The condition above assumes that the rank is not too large. However, if one replaces the 1.2 exponent with 1.25, then the result holds for all values of the rank. Similar results hold for arbitrary rectangular matrices as well. Our results are connected with the recent literature on compressed sensing, and show that objects other than signals and images can be perfectly reconstructed from very limited information.
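The convex program described above (minimize the nuclear norm subject to matching the observed entries) can be sketched numerically with a simple proximal iteration that soft-thresholds singular values, in the spirit of Soft-Impute / singular value thresholding. This is an illustrative sketch, not the paper's exact algorithm: the matrix dimensions, rank, sampling fraction, threshold `tau`, and iteration count below are all assumed for demonstration.

```python
import numpy as np

def soft_impute(M_obs, mask, tau=0.1, n_iters=500):
    """Approximate nuclear-norm-regularized matrix completion.

    M_obs : matrix holding the observed entries (zeros elsewhere)
    mask  : boolean array, True where an entry was observed
    tau   : soft-threshold applied to the singular values
            (the proximal operator of the nuclear norm)
    """
    X = np.zeros_like(M_obs)
    for _ in range(n_iters):
        # Keep observed entries fixed; fill the rest with the current estimate.
        Y = np.where(mask, M_obs, X)
        # Soft-threshold the singular values to shrink the nuclear norm.
        U, s, Vt = np.linalg.svd(Y, full_matrices=False)
        X = U @ np.diag(np.maximum(s - tau, 0.0)) @ Vt
    return X

# Synthetic rank-2 example: 30x30 matrix, roughly half the entries observed.
rng = np.random.default_rng(0)
M = rng.standard_normal((30, 2)) @ rng.standard_normal((2, 30))
mask = rng.random(M.shape) < 0.5
X = soft_impute(np.where(mask, M, 0.0), mask, tau=0.1, n_iters=500)
rel_err = np.linalg.norm(X - M) / np.linalg.norm(M)
```

With far fewer observations than entries (here ~450 samples versus 900 entries, against roughly 2r(2n − r) = 116 degrees of freedom for a rank-2 matrix), the low-rank structure lets the iteration recover the unobserved entries to small relative error, illustrating the abstract's claim that the nuclear-norm program completes most low-rank matrices from an incomplete sample.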