In this paper, we introduce ways of making pattern search more efficient by reusing previous evaluations of the objective function, based on the computation of simplex derivatives (e.g., simplex gradients). At each iteration, one can attempt to compute an accurate simplex gradient by identifying a sampling set of previously evaluated points with good geometrical properties. This can be done using only past successful iterates or by considering all past function evaluations. The simplex gradient can then be used to reorder the evaluations of the objective function associated with the directions used in the poll step, or to update the mesh size parameter according to a sufficient decrease criterion; neither use requires new function evaluations. We present these procedures in detail and apply them to a set of problems from the CUTEr collection. Numerical results show that these procedures can significantly enhance the practical performance of pattern search methods.
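
To make the mechanics concrete, here is a minimal sketch (in Python with NumPy, not the authors' implementation) of the two ingredients described above: a least-squares simplex gradient built from previously evaluated points, and its use to reorder the poll directions so that the most promising direction is evaluated first. The function names simplex_gradient and order_poll, the quadratic test objective, and the coordinate poll set are all illustrative assumptions.

import numpy as np

def simplex_gradient(points, values):
    # Least-squares simplex gradient at the poll center points[0].
    # Solves S g ~= delta_f in the least-squares sense, where row i of S
    # is the displacement y_i - y_0 and delta_f[i] = f(y_i) - f(y_0).
    S = points[1:] - points[0]
    delta_f = values[1:] - values[0]
    g, *_ = np.linalg.lstsq(S, delta_f, rcond=None)
    return g

def order_poll(directions, g):
    # Sort poll directions by increasing angle with -g, i.e. by ascending
    # d . g / ||d||, so the likeliest descent direction is polled first.
    key = lambda d: np.dot(d, g) / np.linalg.norm(d)
    return sorted(directions, key=key)

# Illustrative use: a quadratic objective and a coordinate poll set.
f = lambda x: (x[0] - 1.0) ** 2 + 4.0 * (x[1] + 0.5) ** 2
center = np.array([0.0, 0.0])
cache = np.array([center, [0.1, 0.0], [0.0, 0.1], [0.1, 0.1]])
g = simplex_gradient(cache, np.array([f(p) for p in cache]))
poll = [np.array(d, float) for d in ([1, 0], [-1, 0], [0, 1], [0, -1])]
for d in order_poll(poll, g):
    print(d, f(center + 0.1 * d))   # most promising direction first

Because the sampling set consists of points already stored in the cache of past evaluations, the reordering itself costs no new calls to the objective; the same gradient estimate could also drive a sufficient decrease test for the mesh size update mentioned above.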