Constrained global optimization: algorithms and applications
Recent developments and trends in global optimization
Journal of Computational and Applied Mathematics - Special issue on numerical analysis 2000 Vol. IV: optimization and nonlinear equations
Computational Experience with a New Class of Convex Underestimators: Box-constrained NLP Problems
Journal of Global Optimization
The design of the Boost interval arithmetic library
Theoretical Computer Science - Real numbers and computers
Deterministic Global Optimization: Theory, Methods and Applications (Nonconvex Optimization and Its Applications, Volume 37)
Introduction to Global Optimization (Nonconvex Optimization and Its Applications)
Global optimization of signomial mixed-integer nonlinear programming problems with free variables
Journal of Global Optimization
Some transformation techniques with applications in global optimization
Journal of Global Optimization
Convex underestimation strategies for signomial functions
Optimization Methods & Software - GLOBAL OPTIMIZATION
Exponential and power transformations for convexifying signomial terms in MINLP problems
MIC '08 Proceedings of the 27th IASTED International Conference on Modelling, Identification and Control
An Efficient Global Approach for Posynomial Geometric Programming Problems
INFORMS Journal on Computing
Global solution of optimization problems with signomial parts
Discrete Optimization
In this paper, we present a global optimization method for solving nonconvex mixed-integer nonlinear programming (MINLP) problems. A convex overestimation of the feasible region is obtained by replacing the nonconvex constraint functions with convex underestimators. For signomial functions, single-variable power and exponential transformations are used to obtain the convex underestimators. For more general nonconvex functions, two versions of the so-called αBB underestimator, valid for twice-differentiable functions, are integrated into the reformulation framework. However, in contrast to what is done in branch-and-bound type algorithms, no direct branching is performed in the actual algorithm. Instead, a piecewise convex reformulation is used to convexify the entire problem in an extended variable space, and the reformulated problem is then solved by a convex MINLP solver. As the piecewise linear approximations are made finer, the solutions to the convexified and overestimated problems form a sequence converging to a global optimal solution. The result is an easily implementable algorithm for solving a very general class of optimization problems.
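As a rough illustration of the αBB idea referred to above (not the paper's full reformulation framework), the following sketch builds the one-dimensional αBB convex underestimator of a twice-differentiable function over a box: the function is perturbed by a concave quadratic term scaled by α, where α is derived from a lower bound on the second derivative. The function names and the example bound are assumptions for illustration only.

```python
import math

def alpha_bb_underestimator(f, d2f_min, lo, hi):
    """Return the 1-D alphaBB convex underestimator of f on [lo, hi].

    d2f_min is a (user-supplied) lower bound on f'' over [lo, hi];
    alpha = max(0, -0.5 * d2f_min) then makes L convex, and the added
    quadratic term is non-positive on the box, so L(x) <= f(x) there.
    """
    alpha = max(0.0, -0.5 * d2f_min)
    def L(x):
        # (lo - x) <= 0 and (hi - x) >= 0 on [lo, hi], so the term is <= 0
        return f(x) + alpha * (lo - x) * (hi - x)
    return L

# Example: sin(x) is nonconvex on [0, 2*pi]; its second derivative
# -sin(x) is bounded below by -1, so alpha = 0.5 suffices.
L = alpha_bb_underestimator(math.sin, -1.0, 0.0, 2.0 * math.pi)
```

On the box, L underestimates sin everywhere and is convex, which is what makes such underestimators usable for building the convex relaxations the abstract describes.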