An efficient algorithm for large scale global optimization of continuous functions

  • Authors:
  • Yong-Jun Wang; Jiang-She Zhang

  • Affiliations:
  • Faculty of Science, Xi'an Jiaotong University, Xi'an 710049, China (both authors)

  • Venue:
  • Journal of Computational and Applied Mathematics
  • Year:
  • 2007

Abstract

A fast descent algorithm for large scale global optimization of continuous functions is proposed. It employs a "stretching" function technique and builds on a hybrid method (GRSA) that combines the simulated annealing (SA) algorithm with gradient-based methods. Unlike the previously proposed method, in which the original objective function remains unchanged throughout the optimization, the new method first constructs an auxiliary function at a local minimizer found by a gradient-based method; SA is then executed on this auxiliary function rather than on the original objective, which improves SA's ability to jump from the currently discovered local minimum to a better one, from which the gradient-based method restarts a new local search. This procedure is repeated until a global minimum is detected. In addition, a new scheme for generating the next trial point is designed to match the adopted "stretching" technique. Simulations, especially on large scale problems, verify that convergence is greatly accelerated; this distinguishes the method from many reported approaches, which mostly handle functions of fewer than 50 variables and do not apply to large scale problems. Furthermore, the new algorithm functions as a global optimization procedure with a high success probability and high solution precision.
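The abstract describes the loop concretely enough to sketch: find a local minimizer with a gradient-based method, stretch the objective around it, run SA on the stretched auxiliary function to escape the basin, and restart local search from wherever SA lands. Below is a minimal Python sketch of that loop. The stretching form follows the well-known function-stretching technique of Parsopoulos and Vrahatis, which the quoted term suggests; the paper's exact formulation and its tailored trial-point generating scheme are not reproduced here, and gamma1, gamma2, mu, the Gaussian proposal, and the cooling schedule are illustrative assumptions rather than the authors' settings.

    import numpy as np
    from scipy.optimize import minimize

    def make_stretched(f, x_star, f_star, gamma1=1e4, gamma2=1.0, mu=1e-8):
        """Auxiliary function that raises the landscape around x_star so SA
        can escape its basin; points with f(x) < f_star are left unchanged."""
        def h(x):
            s = np.sign(f(x) - f_star) + 1.0       # 2 above f_star, 0 below
            if s == 0.0:
                return f(x)                        # strictly better region: untouched
            g = f(x) + 0.5 * gamma1 * np.linalg.norm(x - x_star) * s
            return g + 0.5 * gamma2 * s / np.tanh(mu * (g - f_star) + 1e-12)
        return h

    def hybrid_minimize(f, x0, cycles=10, sa_steps=500, T0=1.0, cool=0.99,
                        step=0.5, seed=0):
        rng = np.random.default_rng(seed)
        x0 = np.asarray(x0, dtype=float)
        res = minimize(f, x0, method="BFGS")       # initial local search on f
        x_best, f_best = res.x, res.fun
        for _ in range(cycles):
            h = make_stretched(f, x_best, f_best)
            x, hx, T = x_best.copy(), h(x_best), T0
            for _ in range(sa_steps):              # SA on the stretched function
                cand = x + step * rng.standard_normal(x.size)
                hc = h(cand)
                if hc < hx or rng.random() < np.exp(-(hc - hx) / T):
                    x, hx = cand, hc
                T *= cool
            res = minimize(f, x, method="BFGS")    # restart local search on f
            if res.fun < f_best:
                x_best, f_best = res.x, res.fun
        return x_best, f_best

    # Example: 50-dimensional Rastrigin function (global minimum 0 at the origin).
    rastrigin = lambda x: 10.0 * x.size + np.sum(x**2 - 10.0 * np.cos(2 * np.pi * x))
    x, fx = hybrid_minimize(rastrigin, np.full(50, 3.0))

The key property of the stretched auxiliary function is that points strictly below the current best value are left untouched, so any basin SA escapes into is guaranteed to contain a better minimizer for the gradient-based restart to find.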