Enhanced simulated annealing for globally minimizing functions of many-continuous variables

  • Authors:
  • Patrick Siarry; Gérard Berthiau; François Durbin; Jacques Haussy

  • Affiliations:
  • Ecole Centrale de Paris, Paris, France; C.E.A., France; C.E.A., France; C.E.A., France

  • Venue:
  • ACM Transactions on Mathematical Software (TOMS)
  • Year:
  • 1997

Abstract

A new global optimization algorithm for functions of many continuous variables is presented, derived from the basic simulated annealing method. Our main contribution lies in dealing with high-dimensionality minimization problems, which are often difficult to solve with known minimization methods, whether or not they use gradients. In this article we take a special interest in the issue of variable discretization. We also develop and implement several complementary stopping criteria. The original Metropolis iterative random search, which takes place in a Euclidean space R^n, is replaced by a similar exploration performed within a succession of Euclidean spaces R^p, with p ≪ n. This Enhanced Simulated Annealing (ESA) algorithm was validated first on classical, highly multimodal functions of 2 to 100 variables. We obtained significant reductions in the number of function evaluations compared with six other global optimization algorithms, selected according to previously published computational results for the same set of test functions. In most cases, ESA was able to closely approximate known global optima. The reduced computational cost of ESA allowed us to refine the obtained global results further through the use of local search. We have used this new minimization procedure to solve complex circuit design problems, for which evaluating the objective function can be exceedingly costly.
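
The subspace-exploration idea can be illustrated with a short sketch: at each Metropolis move, only a small subset of p out of the n variables is perturbed, so each trial step effectively takes place in R^p with p ≪ n. The Python code below is a minimal, hypothetical illustration under simple assumptions (geometric cooling, uniform perturbations, a single evaluation-budget stopping rule); it does not reproduce the authors' variable discretization scheme or their complementary stopping criteria, and all names and parameters are illustrative.

```python
import math
import random

def esa_like_minimize(f, x0, lower, upper, p=4, temp0=1.0, cooling=0.95,
                      moves_per_temp=100, max_evals=50_000, seed=0):
    """Simulated-annealing sketch in which each Metropolis move perturbs only
    p of the n variables, i.e. the trial step lives in R^p with p << n.
    Hypothetical illustration only, not the published ESA implementation."""
    rng = random.Random(seed)
    n = len(x0)
    x, fx = list(x0), f(x0)
    best_x, best_f = list(x0), fx
    temp, evals = temp0, 1
    # Per-variable step sizes, here fixed at 10% of each variable's range.
    step = [0.1 * (u - l) for l, u in zip(lower, upper)]

    while evals < max_evals and temp > 1e-8:
        for _ in range(moves_per_temp):
            # Choose a random subset of p coordinates and perturb only those.
            subset = rng.sample(range(n), min(p, n))
            y = list(x)
            for i in subset:
                y[i] = min(upper[i], max(lower[i],
                                         x[i] + rng.uniform(-step[i], step[i])))
            fy = f(y)
            evals += 1
            # Metropolis acceptance rule.
            if fy <= fx or rng.random() < math.exp(-(fy - fx) / temp):
                x, fx = y, fy
                if fx < best_f:
                    best_x, best_f = list(x), fx
            if evals >= max_evals:
                break
        temp *= cooling  # geometric cooling schedule

    return best_x, best_f

if __name__ == "__main__":
    # Demo on a 30-variable Rastrigin function, a classical multimodal test case.
    def rastrigin(x):
        return 10.0 * len(x) + sum(v * v - 10.0 * math.cos(2.0 * math.pi * v)
                                   for v in x)

    n = 30
    bounds_lo, bounds_hi = [-5.12] * n, [5.12] * n
    x0 = [random.uniform(-5.12, 5.12) for _ in range(n)]
    xbest, fbest = esa_like_minimize(rastrigin, x0, bounds_lo, bounds_hi, p=4)
    print(f"best f = {fbest:.4f}")
```

Because each move touches only p coordinates, the per-move perturbation cost does not grow with n; in the sketch above the dimension of the explored subspace is controlled by the single parameter p, which is the simplest possible stand-in for the succession of R^p spaces described in the abstract.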