A function F on the space of n × n real symmetric matrices is called spectral if it depends only on the eigenvalues of its argument. Equivalently, a spectral function is the composition F = f ∘ λ of a symmetric function f (one invariant under permutations of its arguments) with the eigenvalue map λ. We show that a spectral function is twice (continuously) differentiable at a matrix if and only if the corresponding symmetric function is twice (continuously) differentiable at the vector of eigenvalues. We give a concise and usable formula for the Hessian.
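As an illustrative sketch (not taken from the paper), the defining property F = f ∘ λ can be checked numerically: a spectral function is exactly an orthogonally invariant function of symmetric matrices. The choice f(x) = Σ x_i² below is one assumed example of a symmetric function, for which F(X) equals the squared Frobenius norm of X.

```python
import numpy as np

def f(x):
    # a symmetric function: invariant under permutations of x
    # (hypothetical example choice: sum of squares)
    return np.sum(x ** 2)

def F(X):
    # spectral function F = f(lambda(X)): depends on X only
    # through its eigenvalues
    return f(np.linalg.eigvalsh(X))

rng = np.random.default_rng(0)
A = rng.standard_normal((4, 4))
X = (A + A.T) / 2                                  # random symmetric matrix
Q, _ = np.linalg.qr(rng.standard_normal((4, 4)))   # random orthogonal matrix

# Orthogonal invariance: F(Q X Q^T) = F(X), since conjugation by Q
# preserves the eigenvalues of X.
print(np.isclose(F(Q @ X @ Q.T), F(X)))            # True

# For this particular f, F(X) is the squared Frobenius norm of X.
print(np.isclose(F(X), np.sum(X * X)))             # True
```

The first check holds for any symmetric f, which is the sense in which spectral functions are "just" symmetric functions of the eigenvalues.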