Min-max and robust polynomial optimization

  • Authors:
  • J. B. Lasserre

  • Affiliations:
LAAS-CNRS and Institute of Mathematics, LAAS, 31077 Toulouse Cedex 4, France

  • Venue:
  • Journal of Global Optimization
  • Year:
  • 2011

Abstract

We consider the robust (or min-max) optimization problem $$J^*:=\max_{\mathbf{y}\in{\Omega}}\min_{\mathbf{x}}\{f(\mathbf{x},\mathbf{y}): (\mathbf{x},\mathbf{y})\in\mathbf{\Delta}\},$$ where f is a polynomial and $${\mathbf{\Delta}\subset\mathbb{R}^n\times\mathbb{R}^p}$$ as well as $${{\Omega}\subset\mathbb{R}^p}$$ are compact basic semi-algebraic sets. We first provide a sequence of polynomial lower approximations $${(J_i)\subset\mathbb{R}[\mathbf{y}]}$$ of the optimal value function $${J(\mathbf{y}):=\min_\mathbf{x}\{f(\mathbf{x},\mathbf{y}): (\mathbf{x},\mathbf{y})\in \mathbf{\Delta}\}}$$. The polynomial $${J_i\in\mathbb{R}[\mathbf{y}]}$$ is obtained from an optimal (or nearly optimal) solution of a semidefinite program, the ith in the "joint + marginal" hierarchy of semidefinite relaxations associated with the parametric optimization problem $${\mathbf{y}\mapsto J(\mathbf{y})}$$, recently proposed in Lasserre (SIAM J Optim 20, 1995–2022, 2010). Then for fixed i, we consider the polynomial optimization problem $${J^*_i:=\max\nolimits_{\mathbf{y}}\{J_i(\mathbf{y}):\mathbf{y}\in{\Omega}\}}$$ and prove that $${\hat{J}^*_i(:=\max\nolimits_{\ell=1,\ldots,i}J^*_\ell)}$$ converges to J* as $${i\to\infty}$$. Finally, for fixed $${\ell\leq i}$$, each $${J^*_\ell}$$ (and hence $${\hat{J}^*_i}$$) can be approximated by solving a hierarchy of semidefinite relaxations as already described in Lasserre (SIAM J Optim 11, 796–817, 2001; Moments, Positive Polynomials and Their Applications. Imperial College Press, London, 2009).
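As a hedged sketch of the first step (the notation $${g_j}$$ and $${\varphi}$$ below is introduced here for illustration and does not appear in the abstract): write $${\mathbf{\Delta}=\{(\mathbf{x},\mathbf{y}): g_j(\mathbf{x},\mathbf{y})\geq 0,\ j=1,\ldots,m\}}$$ and fix a probability measure $${\varphi}$$ on $${\Omega}$$ whose moments are known (e.g. uniform). The SOS side of the ith "joint + marginal" relaxation can then be written schematically as $$J_i\in\arg\max_{p,\,\sigma_j}\Big\{\int_{\Omega} p\,d\varphi \;:\; f-p=\sigma_0+\sum_{j=1}^{m}\sigma_j\,g_j\Big\},$$ where $${p\in\mathbb{R}[\mathbf{y}]}$$ and the $${\sigma_j\subset\mathbb{R}[\mathbf{x},\mathbf{y}]}$$ are sums of squares with degrees bounded so that the problem is a semidefinite program of size indexed by i. Any feasible p satisfies $${f(\mathbf{x},\mathbf{y})-p(\mathbf{y})\geq 0}$$ on $${\mathbf{\Delta}}$$, hence $${p(\mathbf{y})\leq J(\mathbf{y})}$$ wherever $${J}$$ is defined; maximizing the resulting $${J_i}$$ over $${\Omega}$$ (the second step, via the standard hierarchy) therefore yields lower bounds $${J^*_i\leq J^*}$$ that converge as stated above.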