Smoothing Technique and its Applications in Semidefinite Optimization

  • Authors:
  • Yurii Nesterov

  • Affiliations:
  • Catholic University of Louvain (UCL), Center for Operations Research and Econometrics (CORE), 34 voie du Roman Pays, 1348, Louvain-la-Neuve, Belgium

  • Venue:
  • Mathematical Programming: Series A and B
  • Year:
  • 2007

Abstract

In this paper we extend the smoothing technique (Nesterov in Math Program 103(1):127–152, 2005; Nesterov in Unconstrained convex minimization in relative scale, 2003) to problems of semidefinite optimization. To this end, we develop a simple framework for estimating a Lipschitz constant for the gradient of certain symmetric functions of the eigenvalues of symmetric matrices. Using this technique, we justify Lipschitz constants for some natural smooth approximations of the maximal eigenvalue and the spectral radius of symmetric matrices. We analyze the efficiency of special gradient-type schemes for minimizing the maximal eigenvalue or the spectral radius of a matrix that depends linearly on the design variables. We show that in the first case the number of iterations of the method is bounded by $$O({1}/{\epsilon})$$, where $$\epsilon$$ is the required absolute accuracy of the problem. In the second case, the number of iterations is bounded by $${({4}/{\delta})} \sqrt{(1 + \delta)\, r \ln r}$$, where $$\delta$$ is the required relative accuracy and $$r$$ is the maximal rank of the corresponding linear matrix inequality. Thus, the latter method is a fully polynomial approximation scheme.
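To give a concrete flavor of the smoothing idea, the sketch below (not taken from the paper; function names and the choice of test matrix are illustrative) shows one standard smooth approximation of the maximal eigenvalue: the log-sum-exp (entropy) smoothing of the eigenvalue vector. For a symmetric matrix $X$ with eigenvalues $\lambda_1(X),\dots,\lambda_n(X)$, the function $f_\mu(X) = \mu \ln \sum_i e^{\lambda_i(X)/\mu}$ is a smooth upper bound on $\lambda_{\max}(X)$ within additive error $\mu \ln n$, and is the kind of approximation whose gradient Lipschitz constant the paper's framework estimates.

```python
import numpy as np

def smooth_max_eig(X, mu):
    """Log-sum-exp smoothing of the maximal eigenvalue of a symmetric matrix X.

    f_mu(X) = mu * ln( sum_i exp(lambda_i(X) / mu) )

    This satisfies  lambda_max(X) <= f_mu(X) <= lambda_max(X) + mu * ln(n),
    so taking mu ~ epsilon / ln(n) gives an epsilon-accurate smooth surrogate.
    """
    lam = np.linalg.eigvalsh(X)          # eigenvalues of a symmetric matrix
    m = lam.max()                        # shift for numerical stability
    return m + mu * np.log(np.exp((lam - m) / mu).sum())

# Example: a small symmetric matrix (illustrative data).
X = np.array([[2.0, 1.0],
              [1.0, 3.0]])
lam_max = np.linalg.eigvalsh(X).max()
approx = smooth_max_eig(X, mu=0.01)
# approx lies between lam_max and lam_max + 0.01 * ln(2)
```

Minimizing such a smooth surrogate with a fast gradient method is what yields the $O(1/\epsilon)$ iteration bound stated in the abstract, in contrast to the $O(1/\epsilon^2)$ rate typical of subgradient schemes applied directly to the nonsmooth function $\lambda_{\max}$.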