Simulated Annealing for Convex Optimization

  • Authors:
  • Adam Tauman Kalai; Santosh Vempala

  • Affiliations:
  • Toyota Technological Institute at Chicago, 1427 East 60th Street, Second Floor, Chicago, Illinois 60637; Department of Mathematics, Massachusetts Institute of Technology, Cambridge, Massachusetts 02139

  • Venue:
  • Mathematics of Operations Research
  • Year:
  • 2006

Abstract

We apply the method known as simulated annealing to the following problem in convex optimization: Minimize a linear function over an arbitrary convex set, where the convex set is specified only by a membership oracle. Using distributions from the Boltzmann-Gibbs family leads to an algorithm that needs only O*(√n) phases for instances in R^n. This gives an optimization algorithm that makes O*(n^4.5) calls to the membership oracle, in the worst case, compared to the previous best guarantee of O*(n^5). The benefits of using annealing here are surprising because such problems have no local minima that are not also global minima. Hence, we conclude that one of the advantages of simulated annealing, in addition to avoiding poor local minima, is that in these problems it converges faster to the minima that it finds. We also give a proof that under certain general conditions, the Boltzmann-Gibbs distributions are optimal for annealing on these convex problems.
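
To give a rough sense of the approach described in the abstract, the sketch below anneals a Boltzmann-Gibbs density proportional to exp(-c·x/T) over a convex body K accessed only through a membership oracle, cooling the temperature by a factor (1 - 1/√n) per phase as in the paper's schedule. This is a minimal illustration, not the paper's analyzed algorithm: the hit-and-run-style sampler with a discretized chord, and all step counts and tolerances (T0, eps, steps_per_phase, n_line) are illustrative assumptions chosen for readability.

```python
import numpy as np

def _chord_extent(x, d, membership, t_max=1e3, iters=40):
    """Largest t with x + t*d still inside K, found by doubling then bisection
    against the membership oracle."""
    t = 1.0
    while t < t_max and membership(x + t * d):
        t *= 2.0
    lo, hi = 0.0, min(t, t_max)
    for _ in range(iters):
        mid = 0.5 * (lo + hi)
        if membership(x + mid * d):
            lo = mid
        else:
            hi = mid
    return lo

def hit_and_run_step(x, c, T, membership, rng, n_line=64):
    """One hit-and-run-style step targeting density ∝ exp(-c·x / T) on K:
    pick a random direction, find the chord through x, and sample a point on
    the (discretized) chord with Boltzmann-Gibbs weights."""
    n = len(x)
    d = rng.standard_normal(n)
    d /= np.linalg.norm(d)
    lo = -_chord_extent(x, -d, membership)
    hi = _chord_extent(x, d, membership)
    ts = np.linspace(lo, hi, n_line)
    logw = -(c @ d) * ts / T          # objective along the chord is c·x + t (c·d)
    logw -= logw.max()
    w = np.exp(logw)
    w /= w.sum()
    t = rng.choice(ts, p=w)
    return x + t * d

def anneal(c, x0, membership, T0=1.0, eps=1e-3, steps_per_phase=200, seed=0):
    """Minimize c·x over K (given by a membership oracle) by annealing the
    Boltzmann-Gibbs family, cooling T geometrically so that roughly
    sqrt(n) * log(T0/eps) phases are used."""
    rng = np.random.default_rng(seed)
    n = len(x0)
    x = np.asarray(x0, dtype=float)
    T = T0
    while T > eps:
        for _ in range(steps_per_phase):
            x = hit_and_run_step(x, c, T, membership, rng)
        T *= 1.0 - 1.0 / np.sqrt(n)   # cooling schedule of the form T_{i+1} = T_i (1 - 1/sqrt(n))
    return x

if __name__ == "__main__":
    # Toy check: minimize c·x over the unit ball; the true optimum is -c/||c||.
    c = np.array([1.0, 2.0, -1.0])
    x_opt = anneal(c, np.zeros(3), membership=lambda x: np.linalg.norm(x) <= 1.0)
    print(x_opt, -c / np.linalg.norm(c))
```

The point of the cooling rate is the one highlighted in the abstract: shrinking the temperature by a (1 - 1/√n) factor per phase means only O*(√n) phases are needed to go from the starting temperature to the target accuracy, which is what drives the improved oracle-call bound.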