A Class of Globally Convergent Optimization Methods Based on Conservative Convex Separable Approximations

  • Authors:
  • Krister Svanberg

  • Affiliations:
  • -

  • Venue:
  • SIAM Journal on Optimization
  • Year:
  • 2002

Abstract

This paper deals with a certain class of optimization methods, based on conservative convex separable approximations (CCSA), for solving inequality-constrained nonlinear programming problems. Each generated iteration point is a feasible solution with lower objective value than the previous one, and it is proved that the sequence of iteration points converges toward the set of Karush--Kuhn--Tucker points. A major advantage of CCSA methods is that they can be applied to problems with a very large number of variables (say 10^4--10^5) even if the Hessian matrices of the objective and constraint functions are dense.
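
The CCSA class includes MMA-type (method of moving asymptotes) approximations among its members. As a rough illustration of the outer/inner loop structure described in the abstract, the following Python sketch implements the simplest member of the class, with quadratic separable approximations. It is an illustrative toy written for this summary, not the paper's algorithm: all names, tolerances, and the demo problem are assumptions, and each convex subproblem is delegated to SciPy's SLSQP solver rather than a subproblem solver that exploits the separable structure, which is what makes the approach practical at very large scale.

```python
# Illustrative CCSA-style sketch (not the paper's algorithm).  Assumes the
# simplest quadratic form of the separable approximations; the CCSA class in
# the paper also covers MMA-type approximations with moving asymptotes.
import numpy as np
from scipy.optimize import minimize


def separable_approx(f_val, grad, x_k, rho):
    """Convex separable approximation of one function around x_k:
    f(x_k) + grad . (x - x_k) + (rho / 2) * ||x - x_k||^2."""
    def g(x):
        d = x - x_k
        return f_val + grad @ d + 0.5 * rho * (d @ d)
    return g


def ccsa_sketch(f0, grad0, cons_f, cons_grad, x0, bounds, outer_iters=50):
    """Outer loop: build approximations at x_k and solve the convex subproblem.
    Inner loop: double the curvature parameters rho until every approximation
    is conservative (overestimates the true function) at the candidate point."""
    x = np.asarray(x0, dtype=float)
    m = len(cons_f)
    rho0, rho = 1.0, np.ones(m)
    for _ in range(outer_iters):
        f0k, g0k = f0(x), grad0(x)
        fik = [fi(x) for fi in cons_f]
        gik = [gi(x) for gi in cons_grad]
        while True:
            a0 = separable_approx(f0k, g0k, x, rho0)
            ai = [separable_approx(fik[i], gik[i], x, rho[i]) for i in range(m)]
            # Subproblem: minimize a0 subject to ai(x) <= 0 and the box bounds.
            nl_cons = [{'type': 'ineq', 'fun': (lambda z, a=a: -a(z))} for a in ai]
            cand = minimize(a0, x, method='SLSQP',
                            bounds=bounds, constraints=nl_cons).x
            # Conservativeness check at the candidate point.
            ok0 = a0(cand) >= f0(cand) - 1e-12
            oki = [ai[i](cand) >= cons_f[i](cand) - 1e-12 for i in range(m)]
            if ok0 and all(oki):
                break
            if not ok0:
                rho0 *= 2.0
            rho = np.where([not ok for ok in oki], 2.0 * rho, rho)
        if np.linalg.norm(cand - x) < 1e-8:   # crude stopping test
            break
        x = cand
    return x


if __name__ == "__main__":
    # Toy problem: minimize x1^2 + x2^2 subject to x1 + x2 >= 1, 0 <= xi <= 4.
    f0 = lambda x: x @ x
    grad0 = lambda x: 2.0 * x
    cons_f = [lambda x: 1.0 - x[0] - x[1]]            # written as f1(x) <= 0
    cons_grad = [lambda x: np.array([-1.0, -1.0])]
    print(ccsa_sketch(f0, grad0, cons_f, cons_grad,
                      x0=[2.0, 2.0], bounds=[(0.0, 4.0), (0.0, 4.0)]))
```

The conservativeness test is what yields the properties stated in the abstract: an accepted candidate satisfies a_i(cand) <= 0 for every constraint approximation, and f_i(cand) <= a_i(cand) then keeps the iterate feasible, while f0(cand) <= a0(cand) <= a0(x_k) = f0(x_k) gives the monotone decrease of the objective.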