Comparison of multistart global optimization algorithms on the BBOB noiseless testbed

  • Authors:
  • László Pál

  • Affiliations:
  • Sapientia - Hungarian University of Transylvania, Miercurea-Ciuc, Romania

  • Venue:
  • Proceedings of the 15th Annual Conference Companion on Genetic and Evolutionary Computation
  • Year:
  • 2013

Abstract

Multi Level Single Linkage (MLSL) is a multistart, stochastic global optimization method that relies on random sampling and local search. In this paper, we benchmarked three variants of the MLSL algorithm, using two gradient-based and one derivative-free local search method, on the noiseless function testbed. The three methods were also compared with a commercial multistart solver, OQNLP (OptQuest/NLP). Our experiments showed that the results can be influenced substantially by the local search procedure used. Depending on the type of problem, the gradient-based local search methods are faster in the initial stage of the optimization, while the derivative-free method shows superior performance in the final phase for moderate dimensions. In terms of the percentage of solved problems, OQNLP is similar to or even better than (for multi-modal and weakly structured functions) the MLSL method equipped with the gradient-type local search methods in 5-D, while in 20-D the latter algorithms are usually faster.
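
The following is a minimal sketch of an MLSL-style multistart loop, not the exact implementation benchmarked in the paper. It assumes SciPy is available and uses BFGS as the gradient-based local search and Nelder-Mead as the derivative-free one; the critical-distance formula follows the standard MLSL literature, and the sample sizes, sigma, and bound handling are illustrative choices only.

```python
"""Sketch of a Multi Level Single Linkage (MLSL)-style multistart optimizer."""
import numpy as np
from math import gamma, log, pi
from scipy.optimize import minimize


def mlsl(f, bounds, n_per_iter=50, iters=20, sigma=2.0, method="BFGS", seed=0):
    rng = np.random.default_rng(seed)
    lo, hi = np.array(bounds, dtype=float).T
    dim = len(lo)
    volume = np.prod(hi - lo)
    samples, values, minima = [], [], []

    for _ in range(iters):
        # Uniform random sampling of the search space.
        new = rng.uniform(lo, hi, size=(n_per_iter, dim))
        samples.extend(new)
        values.extend(float(f(x)) for x in new)
        n = len(samples)

        # Critical distance shrinks as the sample grows (standard MLSL rule).
        r_k = (1.0 / np.sqrt(pi)) * (
            gamma(1 + dim / 2.0) * volume * sigma * log(n) / n
        ) ** (1.0 / dim)

        pts = np.array(samples)
        vals = np.array(values)
        for i in np.argsort(vals):
            x_i, f_i = pts[i], vals[i]
            # Start a local search only if no nearby sample has a lower value.
            dists = np.linalg.norm(pts - x_i, axis=1)
            if np.any((dists < r_k) & (vals < f_i)):
                continue
            # Note: this sketch does not enforce the bounds during the local search.
            res = minimize(f, x_i, method=method)
            minima.append((res.fun, res.x))

    return min(minima, key=lambda t: t[0])


if __name__ == "__main__":
    sphere = lambda x: float(np.sum(x ** 2))
    best_val, best_x = mlsl(sphere, bounds=[(-5, 5)] * 5, method="Nelder-Mead")
    print(best_val, best_x)
```

Switching `method` between "BFGS" and "Nelder-Mead" mirrors the gradient-based versus derivative-free comparison described in the abstract, though the paper's actual local solvers and parameter settings may differ.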