Mesh Adaptive Direct Search Algorithms for Constrained Optimization

  • Authors:
  • Charles Audet; J. E. Dennis, Jr.

  • Venue:
  • SIAM Journal on Optimization
  • Year:
  • 2006

Abstract

This paper addresses the problem of minimizing a nonsmooth function under general nonsmooth constraints when no derivatives of the objective or constraint functions are available. We introduce the mesh adaptive direct search (MADS) class of algorithms, which extends the generalized pattern search (GPS) class by allowing local exploration, called polling, in an asymptotically dense set of directions in the space of optimization variables. This means that under certain hypotheses, including a weak constraint qualification due to Rockafellar, MADS can treat constraints by the extreme barrier approach of setting the objective to infinity at infeasible points and treating the problem as unconstrained.

The main GPS convergence result identifies limit points $\hat{x}$ at which the Clarke generalized derivatives are nonnegative in a finite set of directions, called refining directions. Although in the unconstrained case nonnegative combinations of these directions span the whole space, the fact that there can be only finitely many GPS refining directions limits rigorous justification of the barrier approach to finitely many linear constraints for GPS. The main result of this paper is that the general MADS framework is flexible enough to generate an asymptotically dense set of refining directions along which the Clarke derivatives are nonnegative. We propose an instance of MADS for which the refining directions are dense in the hypertangent cone at $\hat{x}$ with probability 1 whenever the iterates associated with the refining directions converge to a single $\hat{x}$. This instance of MADS is compared to versions of GPS on some test problems. We also illustrate the limitations of our results with examples.
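
The two ingredients named in the abstract, the extreme barrier treatment of constraints and polling along directions that vary between iterations, can be sketched in a few lines of Python. The sketch below is our own illustration, not the paper's algorithm: the names extreme_barrier and mads_poll_sketch are hypothetical, and the randomly rotated orthonormal poll basis is a simplified stand-in for the paper's mesh/frame construction with asymptotically dense refining directions.

    import numpy as np

    def extreme_barrier(f, constraints, x):
        # Extreme barrier: assign +infinity to infeasible points so the
        # constrained problem can be treated as an unconstrained one.
        if any(c(x) > 0.0 for c in constraints):
            return np.inf
        return f(x)

    def mads_poll_sketch(f, constraints, x0, iters=200, seed=0):
        # Toy poll-only direct search in the spirit of MADS.  Each iteration
        # polls along a randomly rotated maximal positive basis (the columns
        # of an orthogonal matrix and their negatives), so poll directions
        # differ from iteration to iteration; a failed poll shrinks the step
        # and a successful one enlarges it.  This only illustrates the
        # barrier treatment and non-fixed poll directions, not the paper's
        # mesh/frame construction or its convergence guarantees.
        rng = np.random.default_rng(seed)
        x = np.asarray(x0, dtype=float)
        fx = extreme_barrier(f, constraints, x)
        step, n = 1.0, x.size
        for _ in range(iters):
            # Random orthonormal basis via QR of a Gaussian matrix.
            q, _ = np.linalg.qr(rng.standard_normal((n, n)))
            improved = False
            for d in np.vstack([q.T, -q.T]):      # 2n poll directions
                trial = x + step * d
                f_trial = extreme_barrier(f, constraints, trial)
                if f_trial < fx:                  # accept first improvement
                    x, fx, improved = trial, f_trial, True
                    break
            step = 2.0 * step if improved else 0.5 * step
        return x, fx

    # Usage: minimize a nonsmooth objective inside the ball ||x||^2 <= 4.
    f = lambda x: np.max(np.abs(x - 1.0))
    constraints = [lambda x: float(np.dot(x, x)) - 4.0]
    x_best, f_best = mads_poll_sketch(f, constraints, x0=np.zeros(2))

A real MADS instance controls mesh and poll size parameters separately so that the union of normalized poll directions over all iterations becomes dense in the unit sphere; the fixed doubling/halving of the step above is only a simplification.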