We use convex relaxation techniques to provide a sequence of regularized low-rank solutions for large-scale matrix completion problems. Using the nuclear norm as a regularizer, we provide a simple and very efficient convex algorithm for minimizing the reconstruction error subject to a bound on the nuclear norm. Our algorithm SOFT-IMPUTE iteratively replaces the missing elements with those obtained from a soft-thresholded SVD. With warm starts this allows us to efficiently compute an entire regularization path of solutions on a grid of values of the regularization parameter. The computationally intensive part of our algorithm is in computing a low-rank SVD of a dense matrix. Exploiting the problem structure, we show that the task can be performed with a complexity of order linear in the matrix dimensions. Our semidefinite programming algorithm is readily scalable to large matrices; for example, SOFT-IMPUTE takes a few hours to compute low-rank approximations of a 10^6 x 10^6 incomplete matrix with 10^7 observed entries, and fits a rank-95 approximation to the full Netflix training set in 3.3 hours. Our methods achieve good training and test errors and exhibit superior timings when compared to other competitive state-of-the-art techniques.
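The iteration described above (fill in the missing entries, then soft-threshold the singular values of the completed matrix) can be sketched in a few lines of NumPy. This is an illustrative dense-matrix sketch only: the function name `soft_impute`, the parameter names, and the stopping rule are ours, and a full dense SVD is used in place of the structured low-rank SVD the paper relies on for scalability.

```python
import numpy as np

def soft_impute(X, mask, lam, Z_init=None, n_iters=100, tol=1e-4):
    """Sketch of the SOFT-IMPUTE iteration.

    X    : observed matrix (values at unobserved positions are ignored)
    mask : boolean array, True where X is observed
    lam  : soft-thresholding level on the singular values
    """
    # Start from zeros at the missing entries, or from a warm start.
    Z = np.where(mask, X, 0.0) if Z_init is None else Z_init.copy()
    for _ in range(n_iters):
        # Step 1: replace the missing elements with the current estimate.
        filled = np.where(mask, X, Z)
        # Step 2: soft-threshold the singular values of the filled matrix.
        U, s, Vt = np.linalg.svd(filled, full_matrices=False)
        Z_new = (U * np.maximum(s - lam, 0.0)) @ Vt
        # Stop when the relative Frobenius-norm change is small.
        denom = np.linalg.norm(Z, "fro") + 1e-12
        converged = np.linalg.norm(Z_new - Z, "fro") / denom < tol
        Z = Z_new
        if converged:
            break
    return Z
```

To trace out a regularization path with warm starts, one would call `soft_impute` on a decreasing grid of `lam` values, passing each solution as `Z_init` for the next; each solve then needs only a few iterations.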