Mirror descent and nonlinear projected subgradient methods for convex optimization

  • Authors:
  • Amir Beck; Marc Teboulle

  • Affiliations:
  • School of Mathematical Sciences, Tel-Aviv University, Ramat-Aviv 69978, Israel (both authors)

  • Venue:
  • Operations Research Letters
  • Year:
  • 2003


Abstract

The mirror descent algorithm (MDA) was introduced by Nemirovsky and Yudin for solving convex optimization problems. This method exhibits an efficiency estimate that is mildly dependent on the dimension of the decision variables, and is thus suitable for solving very large scale optimization problems. We present a new derivation and analysis of this algorithm. We show that the MDA can be viewed as a nonlinear projected-subgradient type method, derived from using a general distance-like function instead of the usual Euclidean squared distance. Within this interpretation, we derive in a simple way convergence and efficiency estimates. We then propose an entropic mirror descent algorithm for convex minimization over the unit simplex, with a global efficiency estimate proven to be mildly dependent on the dimension of the problem.
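As a rough illustration of the entropic variant described in the abstract: when the distance-like function is the negative entropy, the nonlinear projected-subgradient step over the unit simplex reduces to a multiplicative (exponentiated-gradient) update followed by normalization. The sketch below is an assumption-laden toy, not the paper's exact algorithm: the step-size schedule, the linear test objective, and all function names are illustrative choices.

```python
import math

def entropic_mirror_descent(subgrad, x0, step, n_iters):
    """Sketch of entropic MDA on the unit simplex.

    subgrad(x) returns a subgradient of the objective at x;
    step(k) is an (assumed) step-size schedule; x0 lies in the simplex.
    The update x_i <- x_i * exp(-t_k * g_i), renormalized to sum to 1,
    is the mirror-descent step induced by the negative-entropy distance.
    """
    x = list(x0)
    for k in range(n_iters):
        g = subgrad(x)
        t = step(k)
        w = [xi * math.exp(-t * gi) for xi, gi in zip(x, g)]
        s = sum(w)
        x = [wi / s for wi in w]
    return x

# Toy example: minimize the linear objective f(x) = <c, x> over the simplex;
# the minimizer puts all mass on the coordinate with the smallest c_i.
c = [0.9, 0.1, 0.5]
x = entropic_mirror_descent(lambda x: c,          # constant subgradient of <c, x>
                            [1.0 / 3] * 3,        # uniform starting point
                            lambda k: 1.0 / math.sqrt(k + 1),  # illustrative schedule
                            500)
# x concentrates on index 1, where c is smallest.
```

The multiplicative form keeps the iterates strictly inside the simplex at every step, which is precisely why no explicit Euclidean projection is needed; this is the "nonlinear projection" viewpoint the abstract refers to.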