In this paper, we consider smooth convex approximations to the maximum eigenvalue function. To make the study applicable to a wide class of applications, it is conducted on the composite of the maximum eigenvalue function with a linear operator mapping $\mathbb{R}^m$ to $\mathcal{S}_n$, the space of $n \times n$ symmetric matrices. This composite function is the natural objective when minimizing the maximum eigenvalue function over an affine subspace of $\mathcal{S}_n$. The approximation leads to a sequence of smooth convex minimization problems governed by a smoothing parameter; as the parameter goes to zero, the original problem is recovered. We then derive a computable formula for the Hessian of the smooth convex functions, give a matrix representation of the Hessian, and study regularity conditions that guarantee the nonsingularity of the Hessian matrices. The study of the well-posedness of the smooth convex functions leads to a regularization method that is globally convergent.
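To illustrate the kind of smoothing the abstract describes, a minimal sketch follows. It uses the standard log-sum-exp (exponential) smoothing of the maximum eigenvalue, which is one common choice of smooth convex approximation and is not necessarily the exact construction used in the paper: for a symmetric matrix $X$ with eigenvalues $\lambda_i(X)$ and smoothing parameter $\mu > 0$, $f_\mu(X) = \mu \log \sum_i \exp(\lambda_i(X)/\mu)$ is smooth, convex, and satisfies $\lambda_{\max}(X) \le f_\mu(X) \le \lambda_{\max}(X) + \mu \log n$, so the original maximum eigenvalue is recovered as $\mu \to 0$.

```python
import numpy as np

def lambda_max_smooth(X, mu):
    """Log-sum-exp smoothing of the maximum eigenvalue of a symmetric matrix.

    f_mu(X) = mu * log(sum_i exp(lambda_i(X) / mu)), a smooth convex
    upper bound on lambda_max(X) whose gap is at most mu * log(n).
    """
    lams = np.linalg.eigvalsh(X)            # eigenvalues of symmetric X
    shifted = (lams - lams.max()) / mu      # shift by the max for numerical stability
    return lams.max() + mu * np.log(np.exp(shifted).sum())

# As mu decreases, the smooth value approaches lambda_max from above.
X = np.array([[2.0, 1.0],
              [1.0, 3.0]])
lam_max = np.linalg.eigvalsh(X).max()
for mu in (1.0, 0.1, 0.01):
    gap = lambda_max_smooth(X, mu) - lam_max
    assert 0.0 <= gap <= mu * np.log(2) + 1e-12
```

The shift by `lams.max()` before exponentiating is the usual trick to avoid overflow; it leaves the value unchanged because the constant factors out of the sum and back through the logarithm.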