On the second eigenvalue of random regular graphs. STOC '89: Proceedings of the Twenty-First Annual ACM Symposium on Theory of Computing.
The Geometry of Algorithms with Orthogonality Constraints. SIAM Journal on Matrix Analysis and Applications.
The Expected Norm of Random Matrices. Combinatorics, Probability and Computing.
Fast Monte-Carlo algorithms for finding low-rank approximations. Journal of the ACM.
Spectral techniques applied to sparse random graphs. Random Structures & Algorithms.
Fast computation of low-rank matrix approximations. Journal of the ACM.
Spectral Regularization Algorithms for Learning Large Incomplete Matrices. Journal of Machine Learning Research.
ADMiRA: atomic decomposition for minimum rank approximation. IEEE Transactions on Information Theory.
Recovering the missing components in a large noisy low-rank matrix: application to SFM. IEEE Transactions on Pattern Analysis and Machine Intelligence.
Matrix Completion from Noisy Entries. Journal of Machine Learning Research.
Robust principal component analysis? Journal of the ACM.
Recovering Low-Rank and Sparse Components of Matrices from Incomplete and Noisy Observations. SIAM Journal on Optimization.
Identifying users from their rating patterns. Proceedings of the 2nd Challenge on Context-Aware Movie Recommendation.
Exact matrix completion via convex optimization. Communications of the ACM.
A Simpler Approach to Matrix Completion. Journal of Machine Learning Research.
Beating randomized response on incoherent matrices. STOC '12: Proceedings of the Forty-Fourth Annual ACM Symposium on Theory of Computing.
Accelerated singular value thresholding for matrix completion. Proceedings of the 18th ACM SIGKDD International Conference on Knowledge Discovery and Data Mining.
Low-rank Matrix Recovery via Iteratively Reweighted Least Squares Minimization. SIAM Journal on Optimization.
Restricted strong convexity and weighted matrix completion: optimal bounds with noise. Journal of Machine Learning Research.
Learning spectral embedding via iterative eigenvalue thresholding. Proceedings of the 21st ACM International Conference on Information and Knowledge Management.
Efficient crowdsourcing for multi-class labeling. Proceedings of the ACM SIGMETRICS International Conference on Measurement and Modeling of Computer Systems.
Low-rank matrix completion using alternating minimization. Proceedings of the Forty-Fifth Annual ACM Symposium on Theory of Computing.
Matrix Recipes for Hard Thresholding Methods. Journal of Mathematical Imaging and Vision.
Let M be an nα × n matrix of rank r, and assume that a uniformly random subset E of its entries is observed. We describe an efficient algorithm, which we call OPTSPACE, that reconstructs M from |E| = O(rn) observed entries with relative root mean square error RMSE ≤ C(α) (nr/|E|)^(1/2) with probability larger than 1 − 1/n^3. Further, if r = O(1) and M is sufficiently unstructured, then OPTSPACE reconstructs it exactly from |E| = O(n log n) entries with probability larger than 1 − 1/n^3. This settles (in the case of bounded rank) a question left open by Candès and Recht and improves over the guarantees for their reconstruction algorithm. The complexity of our algorithm is O(|E| r log n), which opens the way to its use for massive data sets. In the process of proving these statements, we obtain a generalization of a celebrated result by Friedman-Kahn-Szemerédi and Feige-Ofek on the spectrum of sparse random matrices.