Lagrangian methods are popular for solving continuous constrained optimization problems. In this paper, we address three important issues in applying Lagrangian methods to optimization problems with inequality constraints.

First, we study methods to transform inequality constraints into equality constraints. An existing method, the slack-variable method, adds a slack variable to each inequality constraint in order to transform it into an equality constraint. Its disadvantage is that, when the search trajectory is inside a feasible region, some satisfied constraints may still affect the Lagrangian function, leading to possible oscillations and divergence when a local minimum lies on the boundary of the feasible region. To overcome this problem, we propose the MaxQ method, in which satisfied constraints have no effect on the Lagrangian function. Hence, minimizing the Lagrangian function in a feasible region always leads to a local minimum of the objective function. We also study strategies to speed up its convergence.

Second, we study methods to improve the convergence speed of Lagrangian methods without degrading solution quality. This is done by an adaptive-control strategy that dynamically adjusts the relative weights between the objective and the Lagrangian part, leading to a better balance between the two and faster convergence.

Third, we study a trace-based method to pull the search trajectory from one saddle point to another in a continuous fashion without restarts. This overcomes a problem in existing Lagrangian methods, which converge to only one saddle point and require random restarts to look for new saddle points, often missing good saddle points in the vicinity of those already found.

Finally, we describe a prototype, Novel (Nonlinear Optimization via External Lead), that implements our proposed strategies, and present improved solutions to a collection of benchmarks.
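The contrast between the two constraint transforms can be sketched as follows. This is an illustrative example, not the paper's implementation: the function names `slack_transform` and `maxq_transform` and the exponent `q = 2` are assumptions. The key property shown is that a slack-variable term can remain nonzero inside the feasible region (unless the slack variable exactly cancels it), whereas the MaxQ term vanishes identically for any satisfied constraint.

```python
# Sketch of two ways to turn an inequality constraint g(x) <= 0 into an
# equality constraint, as discussed in the abstract.

def slack_transform(g_val, z):
    """Slack-variable method: g(x) <= 0  ->  g(x) + z**2 = 0.

    Even when the constraint is satisfied (g_val < 0), this term is
    nonzero unless z**2 happens to equal -g_val, so the satisfied
    constraint can still perturb the Lagrangian function.
    """
    return g_val + z ** 2

def maxq_transform(g_val, q=2):
    """MaxQ method (q is an assumed exponent): g(x) <= 0  ->  max(0, g(x))**q = 0.

    The term is exactly zero whenever the constraint is satisfied, so it
    has no effect on the Lagrangian inside the feasible region.
    """
    return max(0.0, g_val) ** q

# Inside the feasible region, g(x) = -1 < 0:
print(slack_transform(-1.0, z=0.5))   # -0.75: residual effect on the Lagrangian
print(maxq_transform(-1.0))           # 0.0: no effect from a satisfied constraint

# On a violated constraint, g(x) = 0.3 > 0, MaxQ penalizes smoothly:
print(maxq_transform(0.3))            # 0.09
```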