The Minimization of Semicontinuous Functions: Mollifier Subgradients. SIAM Journal on Control and Optimization.
Approximating Subdifferentials by Random Sampling of Gradients. Mathematics of Operations Research.
Pseudospectral Components and the Distance to Uncontrollability. SIAM Journal on Matrix Analysis and Applications.
A Robust Gradient Sampling Algorithm for Nonsmooth, Nonconvex Optimization. SIAM Journal on Optimization.
Mesh Adaptive Direct Search Algorithms for Constrained Optimization. SIAM Journal on Optimization.
Convergence of the Gradient Sampling Algorithm for Nonsmooth Nonconvex Optimization. SIAM Journal on Optimization.
Using Sampling and Simplex Derivatives in Pattern Search Methods. SIAM Journal on Optimization.
Nonsmooth Optimization Through Mesh Adaptive Direct Search and Variable Neighborhood Search. Journal of Global Optimization.
OrthoMADS: A Deterministic MADS Instance with Orthogonal Directions. SIAM Journal on Optimization.
We give a nonderivative version of the gradient sampling algorithm of Burke, Lewis, and Overton for minimizing a locally Lipschitz function $f$ on $\mathbb{R}^n$ that is continuously differentiable on an open dense subset. Instead of gradients of $f$, we use estimates of gradients of the Steklov averages of $f$ (obtained by convolution with mollifiers) which require $f$-values only. We show that the nonderivative version retains the convergence properties of the gradient sampling algorithm. In particular, with probability 1, it either drives the $f$-values to $-\infty$ or each of its cluster points is Clarke stationary for $f$.
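To make the central idea concrete, the sketch below estimates the gradient of a Steklov average of $f$ using only $f$-values: each partial derivative of the average (here taken with the uniform mollifier on the cube $[-\alpha/2,\alpha/2]^n$) is an expectation of a one-coordinate central difference with the remaining coordinates jittered uniformly. This is a minimal illustrative sketch under that assumption, not the paper's exact estimator; the function name, parameters, and test function are hypothetical.

```python
import numpy as np

def steklov_gradient_estimate(f, x, alpha, rng=None):
    """Estimate the gradient of the Steklov average of f at x using f-values only.

    Assumes the Steklov average obtained by convolving f with the uniform
    mollifier on [-alpha/2, alpha/2]^n; each partial derivative of that
    average equals the expectation, over a uniform jitter of the other
    coordinates, of a central difference of width alpha in one coordinate.
    """
    rng = np.random.default_rng() if rng is None else rng
    x = np.asarray(x, dtype=float)
    n = x.size
    g = np.empty(n)
    for i in range(n):
        # Jitter all coordinates uniformly in [-alpha/2, alpha/2] ...
        z = x + alpha * (rng.random(n) - 0.5)
        # ... then take a central difference of width alpha in coordinate i.
        zp, zm = z.copy(), z.copy()
        zp[i] = x[i] + alpha / 2.0
        zm[i] = x[i] - alpha / 2.0
        g[i] = (f(zp) - f(zm)) / alpha
    return g

if __name__ == "__main__":
    # Hypothetical nonsmooth test function: f(v) = |v_1| + v_2^2.
    f = lambda v: np.abs(v[0]) + v[1] ** 2
    print(steklov_gradient_estimate(f, np.array([0.3, -1.0]), alpha=1e-3))
```

For small $\alpha$ the estimate at the point above is close to $(1, -2)$, the gradient of $f$ where it is differentiable; in a gradient sampling scheme such estimates would replace the sampled gradients of $f$.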