Hessian Matrix vs. Gauss-Newton Hessian Matrix

  • Authors:
  • Pei Chen

  • Affiliations:
  • chenpei@mail.sysu.edu.cn

  • Venue:
  • SIAM Journal on Numerical Analysis
  • Year:
  • 2011


Abstract

In this paper, we investigate how the Gauss-Newton Hessian matrix affects the basin of convergence in Newton-type methods. Although the Newton algorithm is theoretically superior to the Gauss-Newton algorithm and the Levenberg-Marquardt (LM) method in terms of asymptotic convergence rate, the LM method is often preferred in practice for nonlinear least squares problems. This paper presents a theoretical analysis of the advantage of the Gauss-Newton Hessian matrix. It is proved that the Gauss-Newton approximation is the only nonnegative convex quadratic approximation that retains a critical property of the original objective function: attaining its minimal value of zero on an $(n-1)$-dimensional manifold (or affine subspace). Because of this property, the Gauss-Newton approximation preserves the zero-on-$(n-1)$-D "structure" of the original problem, which explains why the Gauss-Newton Hessian matrix is preferred for nonlinear least squares problems, especially when the initial point is far from the solution.
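To make the distinction concrete, the following sketch (not from the paper; the residual function and evaluation point are illustrative choices) compares the full Newton Hessian of $f(x) = \tfrac{1}{2}\|r(x)\|^2$, which is $J^\top J + \sum_i r_i \nabla^2 r_i$, with the Gauss-Newton approximation $J^\top J$, which drops the second-order residual terms and is always positive semidefinite:

```python
# Illustrative comparison of the full Hessian vs. the Gauss-Newton Hessian
# for f(x) = 0.5 * ||r(x)||^2 with a simple two-dimensional residual.
# The residual r and the evaluation point are hypothetical examples.
import numpy as np

def residual(x):
    # r: R^2 -> R^2, a simple nonlinear residual (illustrative choice)
    return np.array([x[0]**2 + x[1] - 1.0,
                     x[0] - x[1]**2])

def jacobian(x):
    # Jacobian of r at x
    return np.array([[2.0 * x[0],  1.0],
                     [1.0,        -2.0 * x[1]]])

def full_hessian(x):
    # Newton Hessian of f: J^T J + sum_i r_i(x) * Hess(r_i)(x)
    J = jacobian(x)
    r = residual(x)
    H1 = np.array([[2.0, 0.0], [0.0,  0.0]])   # Hessian of r_1
    H2 = np.array([[0.0, 0.0], [0.0, -2.0]])   # Hessian of r_2
    return J.T @ J + r[0] * H1 + r[1] * H2

def gauss_newton_hessian(x):
    # Gauss-Newton Hessian: J^T J (second-order residual terms dropped)
    J = jacobian(x)
    return J.T @ J

# A point far from the solution (illustrative)
x = np.array([3.0, -2.0])
H_full = full_hessian(x)
H_gn = gauss_newton_hessian(x)

# The Gauss-Newton Hessian is positive semidefinite by construction,
# so the resulting quadratic model is convex; the full Hessian carries
# extra residual-curvature terms that can spoil this far from the solution.
print("Gauss-Newton eigenvalues:", np.linalg.eigvalsh(H_gn))
print("Full Hessian eigenvalues:", np.linalg.eigvalsh(H_full))
```

The positive semidefiniteness of $J^\top J$ is what makes the Gauss-Newton (and LM) quadratic model convex at every iterate, independent of the sign of the residual-curvature terms $r_i \nabla^2 r_i$ that the full Newton Hessian retains.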