The nearest correlation matrix problem is to find the correlation matrix that is closest, in the Frobenius norm, to a given symmetric matrix. The well-studied dual approach reformulates this problem as an unconstrained, continuously differentiable convex optimization problem. Gradient methods and quasi-Newton methods such as BFGS have been applied directly to the dual to obtain globally convergent methods. However, since the objective function in the dual approach is not twice continuously differentiable, these methods converge at best linearly. In this paper, we investigate a Newton-type method for the nearest correlation matrix problem. Based on recent developments in the theory of strongly semismooth matrix-valued functions, we prove quadratic convergence of the proposed Newton method. Numerical experiments confirm the fast convergence and high efficiency of the method.
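To make the dual approach concrete, the following is a minimal illustrative sketch (not the paper's semismooth Newton method): the dual objective is theta(y) = 0.5*||(G + Diag(y))_+||_F^2 - e^T y, where (.)_+ is the projection onto the positive semidefinite cone; its gradient is diag((G + Diag(y))_+) - e, and at the dual optimum the nearest correlation matrix is recovered as X = (G + Diag(y))_+. Here we minimize theta with the quasi-Newton BFGS method mentioned in the abstract; the function name and tolerances are our own.

```python
import numpy as np
from scipy.optimize import minimize


def nearest_correlation_dual(G, gtol=1e-8):
    """Frobenius-nearest correlation matrix via the unconstrained dual.

    Illustrative sketch: minimizes
        theta(y) = 0.5 * ||(G + Diag(y))_+||_F^2 - e^T y
    with BFGS (linearly convergent here, since theta is only C^1),
    then recovers X = (G + Diag(y*))_+.
    """
    G = (G + G.T) / 2.0          # symmetrize the input
    n = G.shape[0]

    def proj_psd(A):
        # Projection onto the positive semidefinite cone:
        # clip negative eigenvalues to zero.
        w, V = np.linalg.eigh(A)
        return (V * np.maximum(w, 0.0)) @ V.T

    def theta(y):
        Xp = proj_psd(G + np.diag(y))
        return 0.5 * np.sum(Xp * Xp) - y.sum()

    def grad(y):
        # Gradient of theta: diag of the projected matrix minus ones.
        return np.diag(proj_psd(G + np.diag(y))) - 1.0

    res = minimize(theta, np.zeros(n), jac=grad, method="BFGS",
                   options={"gtol": gtol})
    return proj_psd(G + np.diag(res.x))


# Usage: an indefinite symmetric matrix with unit diagonal
# (eigenvalues 1 - sqrt(2), 1, 1 + sqrt(2)).
G = np.array([[1.0, 1.0, 0.0],
              [1.0, 1.0, 1.0],
              [0.0, 1.0, 1.0]])
X = nearest_correlation_dual(G)
```

The returned X is symmetric positive semidefinite with unit diagonal (up to the solver tolerance), which is exactly the feasible set of the primal problem; the paper's contribution is replacing BFGS on this dual with a semismooth Newton iteration that converges quadratically.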