A Quadratically Convergent Newton Method for Computing the Nearest Correlation Matrix

  • Authors:
  • Houduo Qi; Defeng Sun

  • Affiliations:
  • -;-

  • Venue:
  • SIAM Journal on Matrix Analysis and Applications
  • Year:
  • 2006

Abstract

The nearest correlation matrix problem is to find the correlation matrix that is closest, in the Frobenius norm, to a given symmetric matrix. The well-studied dual approach reformulates this problem as an unconstrained, continuously differentiable convex optimization problem. Gradient methods and quasi-Newton methods such as BFGS have been applied directly to this dual to obtain globally convergent methods. Since the objective function in the dual approach is not twice continuously differentiable, these methods converge at best linearly. In this paper, we investigate a Newton-type method for the nearest correlation matrix problem. Based on recent developments on strongly semismooth matrix-valued functions, we prove the quadratic convergence of the proposed Newton method. Numerical experiments confirm the fast convergence and the high efficiency of the method.
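To make the dual approach mentioned in the abstract concrete, the sketch below minimizes the standard dual objective f(y) = 0.5*||(G + Diag(y))_+||_F^2 - e^T y with BFGS, i.e. the kind of quasi-Newton baseline the abstract refers to, not the paper's quadratically convergent semismooth Newton method. The primal solution is recovered as (G + Diag(y*))_+. The helper names (proj_psd, nearest_correlation_dual) and the test data are illustrative assumptions, not taken from the paper.

```python
import numpy as np
from scipy.optimize import minimize

def proj_psd(A):
    """Project a symmetric matrix onto the positive semidefinite cone."""
    w, Q = np.linalg.eigh(A)
    return (Q * np.maximum(w, 0.0)) @ Q.T

def nearest_correlation_dual(G, tol=1e-8):
    """Illustrative dual approach: solve the dual with BFGS (not the paper's
    semismooth Newton method).

    Dual objective (up to an additive constant):
        f(y) = 0.5 * ||(G + Diag(y))_+||_F^2 - e^T y,
    with gradient
        grad f(y) = diag((G + Diag(y))_+) - e.
    At an optimal y*, the nearest correlation matrix is (G + Diag(y*))_+.
    """
    G = 0.5 * (G + G.T)          # symmetrize the input
    n = G.shape[0]
    e = np.ones(n)

    def f_and_grad(y):
        Xplus = proj_psd(G + np.diag(y))
        f = 0.5 * np.sum(Xplus * Xplus) - e @ y
        g = np.diag(Xplus) - e
        return f, g

    res = minimize(f_and_grad, np.zeros(n), jac=True, method="BFGS",
                   options={"gtol": tol})
    return proj_psd(G + np.diag(res.x))

# Hypothetical example: perturb the identity and recover the nearest
# correlation matrix; its diagonal should be (numerically) all ones and
# its eigenvalues nonnegative.
rng = np.random.default_rng(0)
G = np.eye(4) + 0.1 * rng.standard_normal((4, 4))
X = nearest_correlation_dual(G)
print(np.round(np.diag(X), 6))
print(np.linalg.eigvalsh(X) >= -1e-10)
```

Because the dual gradient involves the projection onto the positive semidefinite cone, it is only strongly semismooth rather than twice continuously differentiable, which is why BFGS-type methods on this dual converge at best linearly; the paper's contribution is a Newton method for the same dual that achieves quadratic convergence.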