Topics in Matrix Analysis
Krylov Subspace Methods for Solving Large Lyapunov Equations
SIAM Journal on Numerical Analysis
The Geometry of Algorithms with Orthogonality Constraints
SIAM Journal on Matrix Analysis and Applications
A Cyclic Low-Rank Smith Method for Large Sparse Lyapunov Equations
SIAM Journal on Scientific Computing
Solution of the Matrix Equation AX + XB = C [F4]
Communications of the ACM
The Ubiquitous Kronecker Product
Journal of Computational and Applied Mathematics (special issue on Numerical Analysis 2000, Vol. III: Linear Algebra)
Matrix Algorithms
Low-Rank Solution of Lyapunov Equations
SIAM Review
Approximation of Large-Scale Dynamical Systems (Advances in Design and Control)
Mathematical Programming: Series A and B
Trust-Region Methods on Riemannian Manifolds
Foundations of Computational Mathematics
A New Iterative Method for Solving Large-Scale Lyapunov Matrix Equations
SIAM Journal on Scientific Computing
Dynamical Low-Rank Approximation
SIAM Journal on Matrix Analysis and Applications
A Multigrid Method to Solve Large Scale Sylvester Equations
SIAM Journal on Matrix Analysis and Applications
Alternating Projections on Manifolds
Mathematics of Operations Research
A Geometric Newton Method for Oja's Vector Field
Neural Computation
Optimization Algorithms on Matrix Manifolds
Krylov Subspace Methods for Linear Systems with Tensor Product Structure
SIAM Journal on Matrix Analysis and Applications
Low-Rank Optimization on the Cone of Positive Semidefinite Matrices
SIAM Journal on Optimization
Online Learning in the Embedded Manifold of Low-Rank Matrices
The Journal of Machine Learning Research
We propose a new framework, based on optimization on manifolds, for approximating the solution of a Lyapunov matrix equation by a low-rank matrix. The method minimizes the error on the Riemannian manifold of symmetric positive semidefinite matrices of fixed rank. We detail how objects from differential geometry, such as the Riemannian gradient and Hessian, can be computed efficiently for this manifold. As the minimization algorithm we use the Riemannian trust-region method of [P.-A. Absil, C. Baker, and K. Gallivan, Found. Comput. Math., 7 (2007), pp. 303-330], which is based on a second-order model of the objective function on the manifold. Combined with an efficient preconditioner, the method can find low-rank solutions using very little memory. We illustrate our results with numerical examples.
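The core idea above can be sketched in a few lines of NumPy. The toy below is not the paper's algorithm: instead of the Riemannian trust-region method with a second-order model and preconditioner, it runs plain gradient descent with an Armijo backtracking line search on the factor Y of the rank-k parameterization X = YY^T, and it minimizes the simple Frobenius-norm residual of A X + X A = C (a stand-in objective, assuming A and C symmetric). The function names `low_rank_lyapunov` and `lyap_residual` are hypothetical, for illustration only.

```python
import numpy as np

def lyap_residual(A, Y, C):
    """Residual A X + X A - C of the Lyapunov equation at X = Y Y^T."""
    X = Y @ Y.T
    return A @ X + X @ A - C

def low_rank_lyapunov(A, C, k, iters=2000, seed=0):
    """Approximate the solution of A X + X A = C by a rank-k matrix
    X = Y Y^T, via gradient descent on the factor Y.  A simplified
    stand-in for the Riemannian trust-region method of the paper."""
    n = A.shape[0]
    rng = np.random.default_rng(seed)
    Y = 0.1 * rng.standard_normal((n, k))
    f = lambda Y: np.linalg.norm(lyap_residual(A, Y, C), "fro") ** 2
    for _ in range(iters):
        R = lyap_residual(A, Y, C)
        G = 4.0 * (A @ R + R @ A) @ Y      # Euclidean gradient of f in Y
        step = 1.0                          # Armijo backtracking line search
        while f(Y - step * G) > f(Y) - 1e-4 * step * np.sum(G * G):
            step *= 0.5
            if step < 1e-16:
                break
        Y = Y - step * G
    return Y

# Usage: build a problem whose exact solution has rank 2, then recover it.
A = np.diag(np.arange(1.0, 11.0))          # small SPD test matrix
rng = np.random.default_rng(1)
Ys = rng.standard_normal((10, 2))
C = A @ Ys @ Ys.T + Ys @ Ys.T @ A          # right-hand side with rank-2 solution
Y = low_rank_lyapunov(A, C, k=2)
rel = np.linalg.norm(lyap_residual(A, Y, C), "fro") / np.linalg.norm(C, "fro")
```

Only the n-by-k factor Y is ever stored, which is the memory advantage the abstract alludes to; the paper's trust-region method replaces the crude line-searched descent here with Riemannian second-order steps for much faster convergence.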