A Global Optimization Method, QBB, for Twice-Differentiable Nonconvex Optimization Problem

  • Authors:
  • Yushan Zhu; Takahito Kuno

  • Affiliations:
  • Institute of Information Sciences and Electronics, University of Tsukuba, Ibaraki 305-8573, Japan

  • Venue:
  • Journal of Global Optimization
  • Year:
  • 2005

Abstract

A global optimization method, QBB, for twice-differentiable nonlinear programs (NLPs) is developed. It operates within a branch-and-bound framework and requires the construction of a relaxed convex problem based on quadratic lower bounding functions for generic nonconvex structures. Within an exhaustive simplicial division of the constrained region, a rigorous quadratic underestimating function is constructed for each generic nonconvex term by analyzing the maximal eigenvalue of its interval Hessian matrix. As the division proceeds, a valid lower bound on the NLP is computed by solving the relaxed convex program obtained by preserving the convex and linear terms, replacing each concave term with its linear convex envelope, and underestimating special terms and generic terms with customized tight convex lower bounding functions and valid quadratic lower bounding functions, respectively. The standard convergence properties of the QBB algorithm for nonconvex global optimization problems are established, and preliminary computational studies are presented to evaluate the efficiency of the proposed QBB approach.
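The quadratic underestimation step described in the abstract can be illustrated with a minimal sketch. Assuming an upper bound K on the maximal eigenvalue of the interval Hessian over a simplex (obtained here with a simple Gershgorin-style bound, not the paper's exact interval eigenvalue analysis), the function g(x) = f(x) - (K/2)||x||^2 is concave on the simplex and therefore lies above its affine interpolant at the vertices; adding back (K/2)||x||^2 yields a convex quadratic underestimator of f. The function names below are illustrative, not the authors' implementation.

```python
import numpy as np

def eigmax_upper_bound(h_low, h_high):
    """Gershgorin-style upper bound on the maximal eigenvalue of every
    symmetric matrix H with h_low <= H <= h_high elementwise."""
    radii = np.maximum(np.abs(h_low), np.abs(h_high))
    np.fill_diagonal(radii, 0.0)
    return float(np.max(np.diag(h_high) + radii.sum(axis=1)))

def quadratic_underestimator(f, vertices, k):
    """Return q(x) = a(x) + (k/2)||x||^2, where a is the affine interpolant
    of g(x) = f(x) - (k/2)||x||^2 at the n+1 simplex vertices.
    If k >= lambda_max(Hess f) on the simplex, g is concave there,
    so g >= a and hence f >= q on the whole simplex."""
    v = np.asarray(vertices, dtype=float)           # shape (n+1, n)
    g_vals = np.array([f(x) - 0.5 * k * (x @ x) for x in v])
    lhs = np.hstack([v, np.ones((len(v), 1))])      # rows: [v_i, 1]
    coef = np.linalg.solve(lhs, g_vals)             # affine coefficients
    c, b = coef[:-1], coef[-1]
    return lambda x: c @ x + b + 0.5 * k * (x @ x)

# Example on the simplex conv{(0,0), (1,0), (0,1)} for
# f(x) = sin(x0) + x0*x1, whose Hessian is [[-sin(x0), 1], [1, 0]].
f = lambda x: np.sin(x[0]) + x[0] * x[1]
h_low = np.array([[-np.sin(1.0), 1.0], [1.0, 0.0]])  # interval Hessian bounds
h_high = np.array([[0.0, 1.0], [1.0, 0.0]])
k = max(0.0, eigmax_upper_bound(h_low, h_high))      # k = 1 here
q = quadratic_underestimator(f, [(0, 0), (1, 0), (0, 1)], k)
x = np.array([0.3, 0.3])
assert q(x) <= f(x)  # q is a convex quadratic lower bound on the simplex
```

In a QBB-style branch-and-bound search, minimizing such a convex quadratic relaxation over each simplex would supply the valid node lower bounds, which tighten as the simplicial subdivision is refined.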