Comparison-based adaptive strategy selection with bandits in differential evolution

  • Authors:
  • Álvaro Fialho; Raymond Ros; Marc Schoenauer; Michèle Sebag

  • Affiliations:
  • Álvaro Fialho: Microsoft Research, INRIA Joint Centre, Orsay, France
  • Raymond Ros: INRIA Saclay - Île-de-France & LRI, UMR CNRS, Orsay, France
  • Marc Schoenauer: Microsoft Research, INRIA Joint Centre, Orsay, France and INRIA Saclay - Île-de-France & LRI, UMR CNRS, Orsay, France
  • Michèle Sebag: Microsoft Research, INRIA Joint Centre, Orsay, France and INRIA Saclay - Île-de-France & LRI, UMR CNRS, Orsay, France

  • Venue:
  • PPSN'10: Proceedings of the 11th International Conference on Parallel Problem Solving from Nature, Part I
  • Year:
  • 2010

Abstract

Differential Evolution (DE) is a popular and powerful optimization algorithm for continuous problems. Part of its efficiency comes from the availability of several mutation strategies that can (and must) be chosen in a problem-dependent way. However, this flexibility also makes DE difficult to apply automatically to a new problem. F-AUC-Bandit is a comparison-based Adaptive Operator Selection method originally proposed in the GA framework. It is used here for the on-line control of the DE mutation strategy, thus preserving DE's invariance with respect to monotonic transformations of the objective function. The approach is comparatively assessed on the BBOB test suite, demonstrating significant improvement over the baseline and other Adaptive Strategy Selection approaches, while exhibiting very low sensitivity to its hyper-parameter settings.
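
The sketch below illustrates the general idea of bandit-based adaptive strategy selection in DE, not the paper's actual algorithm: it uses plain UCB1 over four commonly cited DE mutation strategies with a simple comparison-based binary credit (1 if the offspring replaces its parent), whereas the paper's F-AUC-Bandit computes an Area-Under-the-ROC-Curve credit over a sliding window of ranks. All function names, parameter values, and the choice of the sphere test function are illustrative assumptions.

```python
import numpy as np

# Minimal sketch of bandit-based adaptive mutation-strategy selection in DE.
# Simplification of the paper's F-AUC-Bandit: UCB1 + binary comparison-based
# credit instead of the rank/AUC credit used by the authors.

rng = np.random.default_rng(0)

def sphere(x):
    return float(np.sum(x ** 2))

def mutate(pop, i, best, F, strategy):
    # Pick five distinct random individuals different from the parent i.
    idx = rng.choice([j for j in range(len(pop)) if j != i], size=5, replace=False)
    a, b, c, d, e = (pop[j] for j in idx)
    if strategy == "rand/1":
        return a + F * (b - c)
    if strategy == "rand/2":
        return a + F * (b - c) + F * (d - e)
    if strategy == "rand-to-best/2":
        return a + F * (best - a) + F * (b - c) + F * (d - e)
    # current-to-rand/1
    return pop[i] + F * (a - pop[i]) + F * (b - c)

def run(dim=10, pop_size=30, budget=3000, F=0.5, CR=0.9, c_ucb=1.0):
    strategies = ["rand/1", "rand/2", "rand-to-best/2", "current-to-rand/1"]
    counts = np.zeros(len(strategies))
    rewards = np.zeros(len(strategies))
    pop = rng.uniform(-5, 5, size=(pop_size, dim))
    fit = np.array([sphere(x) for x in pop])
    evals = pop_size
    while evals < budget:
        for i in range(pop_size):
            # UCB1 selection over strategies (untried arms first).
            if counts.min() == 0:
                k = int(np.argmin(counts))
            else:
                ucb = rewards / counts + c_ucb * np.sqrt(
                    2 * np.log(counts.sum()) / counts)
                k = int(np.argmax(ucb))
            best = pop[int(np.argmin(fit))]
            mutant = mutate(pop, i, best, F, strategies[k])
            # Binomial crossover.
            mask = rng.random(dim) < CR
            mask[rng.integers(dim)] = True
            trial = np.where(mask, mutant, pop[i])
            f_trial = sphere(trial)
            evals += 1
            # Comparison-based credit: depends only on the rank of the trial
            # vs. its parent, hence invariant to monotonic transformations.
            reward = 1.0 if f_trial < fit[i] else 0.0
            counts[k] += 1
            rewards[k] += reward
            if f_trial < fit[i]:
                pop[i], fit[i] = trial, f_trial
            if evals >= budget:
                break
    return fit.min(), dict(zip(strategies, counts.astype(int)))

if __name__ == "__main__":
    best_f, usage = run()
    print("best fitness:", best_f)
    print("strategy usage:", usage)
```

Because the credit only compares the trial against its parent (a rank-based signal), the whole selection mechanism stays invariant under monotonic rescalings of the objective, which is the property the paper's comparison-based design preserves.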