Motivated by recent developments in nonconvex penalties for sparsity modeling, we propose a nonconvex optimization model for handling the low-rank matrix recovery problem. Unlike the well-known robust principal component analysis (RPCA), we recover the low-rank and sparse matrices via a nonconvex loss function combined with a nonconvex penalty. The advantage of the nonconvex approach lies in its stronger robustness to gross corruptions. To solve the model, we devise a majorization-minimization augmented Lagrange multiplier (MM-ALM) algorithm, which finds a local optimum of the proposed nonconvex model. We also provide an efficient strategy to speed up MM-ALM, making its running time comparable with state-of-the-art algorithms for solving RPCA. Finally, empirical results demonstrate the superiority of our nonconvex approach over RPCA in terms of matrix recovery accuracy.
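The abstract does not spell out the algorithm, but the general MM-ALM idea can be sketched as follows: an outer majorization-minimization loop replaces the nonconvex penalties with weighted nuclear-norm and weighted ℓ1 surrogates (reweighted at each pass), and an inner inexact augmented Lagrange multiplier loop solves each weighted convex subproblem. Everything below — the choice of log-sum-style reweighting, the parameter defaults, and the function names — is an illustrative assumption, not the paper's exact method.

```python
import numpy as np

def weighted_svt(X, taus):
    """Weighted singular value thresholding (weights assumed non-decreasing
    in the descending singular-value order, so per-value shrinkage is valid)."""
    U, s, Vt = np.linalg.svd(X, full_matrices=False)
    s = np.maximum(s - taus, 0.0)
    return U @ np.diag(s) @ Vt, s

def soft_shrink(X, taus):
    """Entrywise soft thresholding with (possibly entrywise) thresholds."""
    return np.sign(X) * np.maximum(np.abs(X) - taus, 0.0)

def mm_alm(D, lam=None, eps=1e-2, n_outer=3, n_inner=200, tol=1e-7):
    """Illustrative MM-ALM sketch: recover L (low-rank) + S (sparse) from D.

    Outer loop: majorize a log-sum-type nonconvex penalty by reweighted
    nuclear / l1 norms (first pass has unit weights, i.e. convex RPCA).
    Inner loop: inexact ALM on the weighted convex subproblem.
    All parameter choices here are assumptions for illustration.
    """
    m, n = D.shape
    if lam is None:
        lam = 1.0 / np.sqrt(max(m, n))      # classical RPCA default
    w_sv = np.ones(min(m, n))               # weights on singular values
    W_sp = np.ones_like(D)                  # weights on sparse entries
    L = np.zeros_like(D)
    S = np.zeros_like(D)
    norm_D = np.linalg.norm(D)
    for _ in range(n_outer):
        Y = np.zeros_like(D)                # dual variable, reset per pass
        mu = 1.25 / np.linalg.norm(D, 2)    # spectral norm of D
        mu_max, rho = mu * 1e7, 1.5
        for _ in range(n_inner):
            L, s = weighted_svt(D - S + Y / mu, w_sv / mu)
            S = soft_shrink(D - L + Y / mu, lam * W_sp / mu)
            R = D - L - S
            Y += mu * R
            mu = min(rho * mu, mu_max)
            if np.linalg.norm(R) / norm_D < tol:
                break
        # MM step: re-majorize at the current iterate -> reweight.
        # Small values get large weights (pushed to zero), large values
        # get small weights (less shrinkage bias than the convex model).
        w_sv = 1.0 / (s + eps * max(s.max(), 1.0))
        W_sp = 1.0 / (np.abs(S) + eps * max(np.abs(S).max(), 1.0))
    return L, S
```

On an easy synthetic instance (a rank-2 term plus 5% gross sparse corruption), the first pass already behaves like convex RPCA, and the reweighted passes reduce the shrinkage bias on the large singular values and sparse entries; this mirrors the claimed robustness benefit of the nonconvex penalties, though the specific weighting rule here is only a stand-in.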