This paper studies Newton-type methods for the minimization of partly smooth convex functions. Sequential Newton methods are provided using local parameterizations obtained from U-Lagrangian theory and from Riemannian geometry. The Hessian based on the U-Lagrangian depends on the selection of a dual parameter g; revealing the connection to Riemannian geometry yields a natural choice of g for which the two Newton directions coincide. This choice of g is also shown to be related to the least-squares multiplier estimate from a sequential quadratic programming (SQP) approach, and with this multiplier, SQP gives the same search direction as the Newton methods.
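For context, here is a minimal LaTeX sketch of the U-Lagrangian construction the abstract refers to, following the standard UV-decomposition of Lemaréchal, Oustry, and Sagastizábal; the symbols \bar x, \bar g, \mathcal{U}, \mathcal{V} are the conventional ones from that theory, not notation taken from this paper.

% Fix a point \bar x and a subgradient \bar g \in \partial f(\bar x).
% Let \mathcal{V} be the subspace parallel to \partial f(\bar x) and
% \mathcal{U} = \mathcal{V}^{\perp}, so that \mathbb{R}^n = \mathcal{U} \oplus \mathcal{V}.
\[
  L_{\mathcal{U}}(u; \bar g)
    = \inf_{v \in \mathcal{V}}
      \bigl\{ f(\bar x + u + v) - \langle \bar g, v \rangle \bigr\},
  \qquad u \in \mathcal{U}.
\]
% L_{\mathcal{U}} is differentiable at 0, with \nabla L_{\mathcal{U}}(0; \bar g)
% equal to the \mathcal{U}-component of \bar g, so a Newton step on \mathcal{U} solves
\[
  \nabla^{2} L_{\mathcal{U}}(0; \bar g)\, \Delta u
    = -\,\nabla L_{\mathcal{U}}(0; \bar g).
\]
% The Hessian \nabla^{2} L_{\mathcal{U}}(0; \bar g) depends on the dual parameter
% \bar g; the paper's observation is that one particular choice of \bar g makes
% this step agree with the Riemannian Newton step on the manifold along which
% f is smooth.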