A new global optimization method for univariate constrained twice-differentiable NLP problems

  • Authors:
  • Min Ho Chang; Young Cheol Park; Tai-Yong Lee

  • Affiliations:
  • Department of Chemical and Biomolecular Engineering, Korea Advanced Institute of Science and Technology, Daejeon, South Korea 305-701 (all authors)

  • Venue:
  • Journal of Global Optimization
  • Year:
  • 2007

Abstract

In this paper, a new global optimization method is proposed for optimization problems with twice-differentiable objective and constraint functions of a single variable. The method employs a difference of convex underestimator and a convex cut function, where the former is a continuous piecewise concave quadratic function and the latter is a convex quadratic function. The main objectives of this research are to construct a concave quadratic underestimator whose lower bound on the objective function can be evaluated without an iterative local optimizer, and to construct a convex cut function that effectively detects infeasible regions induced by nonconvex constraints. The proposed method is proven to locate the global optimum point with finite ε-convergence. Numerical experiments indicate that the proposed method is competitive with another covering method, the index branch-and-bound algorithm, which relies on the Lipschitz constant.
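
A central point of the abstract is that a concave quadratic underestimator attains its minimum over an interval at an endpoint, so the interval lower bound is available in closed form without an iterative local optimizer. The sketch below illustrates that idea inside a generic covering branch-and-bound loop. It is not the authors' algorithm: the difference of convex underestimator and the convex cut function are replaced by a plain Taylor-based concave quadratic underestimator built from an assumed bound M on the second derivatives, and the same underestimator is reused to discard provably infeasible subintervals. The functions f and g, the bound M, and the tolerance eps are illustrative assumptions.

```python
# Minimal sketch of a covering branch-and-bound for a univariate problem
#   min f(x)  s.t.  g(x) <= 0,  x in [a, b].
# NOT the paper's method: it uses a simple Taylor-based concave quadratic
# underestimator with an assumed bound M on |f''| and |g''| instead of the
# authors' DC underestimator and convex cut function.
import math
import heapq

def concave_quad_lb(fun, dfun, M, lo, hi):
    """Lower bound of fun on [lo, hi] via the concave quadratic underestimator
    q(x) = fun(c) + fun'(c)(x - c) - (M/2)(x - c)^2 with c the midpoint.
    q is concave, so its minimum over [lo, hi] lies at an endpoint and the
    bound needs no iterative local optimizer (assumes M >= sup |fun''|)."""
    c = 0.5 * (lo + hi)
    fc, dfc = fun(c), dfun(c)
    q = lambda x: fc + dfc * (x - c) - 0.5 * M * (x - c) ** 2
    return min(q(lo), q(hi))

def branch_and_bound(f, df, g, dg, a, b, M, eps=1e-6):
    best_x, best_val = None, math.inf
    heap = [(concave_quad_lb(f, df, M, a, b), a, b)]   # (lower bound, lo, hi)
    while heap:
        lb, lo, hi = heapq.heappop(heap)
        if lb >= best_val - eps:          # cannot improve incumbent by > eps
            continue
        # Infeasibility test: if an underestimator of g stays positive on
        # [lo, hi], then g > 0 there, so the subinterval can be discarded.
        if concave_quad_lb(g, dg, M, lo, hi) > 0:
            continue
        c = 0.5 * (lo + hi)
        if g(c) <= 0 and f(c) < best_val:  # feasible midpoint updates incumbent
            best_x, best_val = c, f(c)
        if hi - lo > eps:                  # split and keep refining
            for sub in ((lo, c), (c, hi)):
                heapq.heappush(heap, (concave_quad_lb(f, df, M, *sub), *sub))
    return best_x, best_val

# Illustrative example (assumed): min (x-1)^2 sin(3x)  s.t.  x^2 - 4 <= 0
f  = lambda x: (x - 1) ** 2 * math.sin(3 * x)
df = lambda x: 2 * (x - 1) * math.sin(3 * x) + 3 * (x - 1) ** 2 * math.cos(3 * x)
g  = lambda x: x ** 2 - 4
dg = lambda x: 2 * x
print(branch_and_bound(f, df, g, dg, -3.0, 3.0, M=200.0, eps=1e-6))
```

Because the underestimator is concave, min(q(lo), q(hi)) is a valid closed-form lower bound over the subinterval, which is the property the abstract emphasizes; the paper's piecewise construction and convex cut function are sharper than this one-piece Taylor bound.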