We propose an algorithm for solving optimization problems defined on a subset of the cone of symmetric positive semidefinite matrices. This algorithm relies on the factorization $X=YY^T$, where the number of columns of $Y$ fixes an upper bound on the rank of the positive semidefinite matrix $X$. It is thus very effective for solving problems that have a low-rank solution. The factorization $X=YY^T$ leads to a reformulation of the original problem as an optimization on a particular quotient manifold. The present paper discusses the geometry of that manifold and derives a second-order optimization method with guaranteed quadratic convergence. It furthermore provides conditions on the rank of the factorization that ensure equivalence with the original problem. In contrast to existing methods, the proposed algorithm converges monotonically to the sought solution. Its numerical efficiency is evaluated on two applications: the maximum cut of a graph and sparse principal component analysis.
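To make the $X=YY^T$ idea concrete, here is a minimal sketch applied to the max-cut SDP relaxation (maximize $\mathrm{Tr}(LX)/4$ with $\mathrm{diag}(X)=1$, $X\succeq 0$, $L$ the graph Laplacian). It uses plain first-order Riemannian gradient ascent on the unit-norm rows of $Y$, not the second-order quotient-manifold method the paper derives; the function name, step size, and column count `p` are illustrative choices, not part of the paper.

```python
import numpy as np

def maxcut_lowrank(L, p=3, steps=500, lr=0.1, seed=0):
    """Low-rank factorization X = Y Y^T for the max-cut SDP relaxation.

    The constraint diag(X) = 1 becomes unit-norm rows of Y, so the
    feasible set is a product of spheres (an oblique manifold). We run
    simple Riemannian gradient ascent on f(Y) = Tr(L Y Y^T).
    """
    n = L.shape[0]
    rng = np.random.default_rng(seed)
    Y = rng.standard_normal((n, p))
    Y /= np.linalg.norm(Y, axis=1, keepdims=True)  # put rows on the sphere
    for _ in range(steps):
        G = L @ Y  # Euclidean gradient of f (up to a constant factor)
        # Project onto the tangent space of each unit-norm row.
        G -= np.sum(G * Y, axis=1, keepdims=True) * Y
        Y += lr * G
        Y /= np.linalg.norm(Y, axis=1, keepdims=True)  # retract to the manifold
    return Y

# Example: 4-cycle graph, whose maximum cut (value 4) is attained by the SDP.
L = np.array([[ 2, -1,  0, -1],
              [-1,  2, -1,  0],
              [ 0, -1,  2, -1],
              [-1,  0, -1,  2]], dtype=float)
Y = maxcut_lowrank(L)
val = 0.25 * np.trace(L @ Y @ Y.T)  # SDP objective at X = Y Y^T
```

With `p` columns, the iterate $X=YY^T$ has rank at most `p` by construction, which is the mechanism the abstract refers to; recovering an actual cut from $Y$ would additionally require a rounding step (e.g. Goemans–Williamson hyperplane rounding), which is omitted here.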