Optimization by Stochastic Continuation

  • Authors:
  • Marc C. Robini; Isabelle E. Magnin

  • Affiliations:
  • marc.robini@creatis.insa-lyon.fr and isabelle.magnin@creatis.insa-lyon.fr

  • Venue:
  • SIAM Journal on Imaging Sciences
  • Year:
  • 2010

Abstract

Simulated annealing (SA) and deterministic continuation are well-known generic approaches to global optimization. Deterministic continuation is computationally attractive but produces suboptimal solutions, whereas SA is asymptotically optimal but converges very slowly. In this paper, we introduce a new class of hybrid algorithms that combines the theoretical advantages of SA with the practical advantages of deterministic continuation. We call this class of algorithms stochastic continuation (SC). In a nutshell, SC is a variation of SA in which both the energy function and the communication mechanism are allowed to be time-dependent. We first prove that SC inherits the convergence properties of generalized SA under weak assumptions. Then, we show that SC can be successfully applied to optimization issues raised by the Bayesian approach to signal reconstruction. The considered class of energy functions arises in maximum a posteriori estimation with a Markov random field prior. The associated minimization task is NP-hard and beyond the scope of popular methods such as loopy belief propagation, tree-reweighted message passing, and graph cuts and their extensions. We perform numerical experiments in the context of three-dimensional reconstruction from a very limited number of projections; our results show that SC can substantially outperform both deterministic continuation and SA.
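
The abstract's description of SC (a Metropolis-type annealing loop in which both the energy function and the communication/proposal mechanism vary with time) can be illustrated by a minimal sketch. The toy objective, the relaxation family, and the cooling and step-size schedules below are illustrative assumptions only, not the reconstruction setup used in the paper.

```python
import math
import random

def target_energy(x):
    # Hypothetical nonconvex objective with many local minima.
    return x**2 + 10.0 * (1.0 - math.cos(2.0 * math.pi * x))

def relaxed_energy(x, gamma):
    # Assumed family of relaxations interpolating from a smooth surrogate
    # (gamma = 0) to the target energy (gamma = 1), as in continuation methods.
    return (1.0 - gamma) * x**2 + gamma * target_energy(x)

def stochastic_continuation(n_iter=20000, x0=4.0, seed=0):
    rng = random.Random(seed)
    x = x0
    for k in range(1, n_iter + 1):
        t = k / n_iter
        gamma = t                               # continuation: relaxed -> target energy
        temperature = 1.0 / math.log(k + 1)     # slow, SA-style cooling schedule
        step = 1.0 * (1.0 - t) + 0.05 * t       # time-dependent proposal width
        y = x + rng.gauss(0.0, step)            # candidate from the current proposal
        delta = relaxed_energy(y, gamma) - relaxed_energy(x, gamma)
        # Metropolis acceptance rule at the current temperature.
        if delta <= 0 or rng.random() < math.exp(-delta / temperature):
            x = y
    return x

if __name__ == "__main__":
    x_hat = stochastic_continuation()
    print(f"approximate minimizer: {x_hat:.4f}, energy: {target_energy(x_hat):.4f}")
```

Setting `gamma = 1` and a fixed proposal width recovers plain SA on the target energy, while dropping the acceptance test and temperature recovers a deterministic continuation scheme; SC sits between the two, which is the design point the abstract emphasizes.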