Self-adaptive inexact proximal point methods

  • Authors:
  • William W. Hager; Hongchao Zhang

  • Affiliations:
  • Department of Mathematics, University of Florida, Gainesville, USA 32611-8105; Institute for Mathematics and Its Applications (IMA), University of Minnesota, Minneapolis, USA 55455-0436

  • Venue:
  • Computational Optimization and Applications
  • Year:
  • 2008

Abstract

We propose a class of self-adaptive proximal point methods suitable for degenerate optimization problems where multiple minimizers may exist, or where the Hessian may be singular at a local minimizer. If the proximal regularization parameter has the form $\mu({\bf{x}})=\beta\|\nabla f({\bf{x}})\|^{\eta}$, where $\eta\in[0,2)$ and $\beta>0$ is a constant, we obtain convergence to the set of minimizers that is linear for $\eta=0$ and $\beta$ sufficiently small, superlinear for $\eta\in(0,1)$, and at least quadratic for $\eta\in[1,2)$. Two different acceptance criteria for an approximate solution to the proximal problem are analyzed. These criteria are expressed in terms of the gradient of the proximal function, the gradient of the original function, and the iteration difference. With either acceptance criterion, the convergence results are analogous to those of the exact iterates. Preliminary numerical results are presented using some ill-conditioned CUTE test problems.
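To illustrate the idea behind the abstract, the following is a minimal sketch of an inexact proximal point iteration with the self-adaptive regularization $\mu({\bf{x}})=\beta\|\nabla f({\bf{x}})\|^{\eta}$. The test function, the inner damped-Newton subproblem solver, and the simple gradient-based acceptance tolerance are all illustrative assumptions, not the paper's exact algorithm or acceptance criteria.

```python
def grad_f(x):
    # f(x) = x**4: the Hessian (12*x**2) is singular at the minimizer
    # x = 0, the degenerate setting the method targets.
    return 4.0 * x**3

def prox_point(x0, beta=1.0, eta=1.0, outer_iters=20):
    """Sketch of a self-adaptive inexact proximal point method.

    Each outer step approximately minimizes the proximal function
        y -> f(y) + (mu/2) * (y - x)**2,  mu = beta * |grad f(x)|**eta.
    The inner loop is solved only inexactly: it stops once the gradient
    of the proximal function is small relative to |grad f(x)| -- a
    stand-in for the paper's acceptance criteria, not its exact test.
    """
    x = x0
    for _ in range(outer_iters):
        g = grad_f(x)
        if abs(g) < 1e-12:
            break
        mu = beta * abs(g) ** eta          # self-adaptive regularization
        y = x
        for _ in range(200):               # inexact subproblem solve
            gp = grad_f(y) + mu * (y - x)  # gradient of proximal function
            if abs(gp) <= 0.1 * abs(g):    # hypothetical acceptance rule
                break
            # damped Newton step on the proximal function
            # (f''(y) = 12*y**2 for this test function)
            y -= gp / (12.0 * y**2 + mu + 1e-12)
        x = y
    return x

print(prox_point(1.0))  # approaches the minimizer 0
```

Here $\mu$ shrinks as the iterates approach a minimizer, so the regularization vanishes at the right rate to allow the fast local convergence described above, while keeping each subproblem strongly convex away from the solution.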