The Future of Experimental Research

  • Authors:
  • Thomas Bartz-Beielstein; Mike Preuss

  • Affiliations:
  • Cologne University of Applied Sciences, Cologne, Germany; TU Dortmund University, Dortmund, Germany

  • Venue:
  • Proceedings of the 11th Annual Conference Companion on Genetic and Evolutionary Computation Conference: Late Breaking Papers
  • Year:
  • 2009

Abstract

It is an open secret that the performance of algorithms depends on their parameterizations, and also on the parameterizations of the problem instances. These dependencies, however, can be seen as a means for understanding an algorithm's behavior. Based on modern statistical techniques, we demonstrate how to tune and understand algorithms. We present a comprehensive, effective, and very efficient methodology for the design and experimental analysis of direct search techniques such as evolutionary algorithms, differential evolution, pattern search, or even classical deterministic methods such as the Nelder-Mead simplex algorithm. Our approach extends the sequential parameter optimization (SPO) method, which has been successfully applied as a tuning procedure to numerous heuristics for practical and theoretical optimization problems. Optimization practitioners receive valuable hints for choosing an adequate heuristic for their optimization problems; theoreticians receive guidelines for testing results systematically on real problem instances. Using several examples from theory and practice, we demonstrate how SPO significantly improves the performance of many search heuristics. This performance gain, however, is not available for free. We therefore discuss the costs of the tuning process, as well as its limitations and a number of currently unresolved open issues in experimental research on algorithms.
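The core idea behind SPO-style tuning can be illustrated in miniature: propose candidate parameter settings, evaluate each one several times (because the tuned algorithm is stochastic), and sequentially refine the design region around the best performer. The following pure-Python sketch is only a rough illustration of that loop, not the authors' SPO implementation; the target algorithm (a (1+1)-ES on the sphere function), the function names, and the parameter ranges are all invented here for demonstration.

```python
import random


def sphere(x):
    """Sphere test function: minimum 0 at the origin."""
    return sum(v * v for v in x)


def one_plus_one_es(sigma, dim=5, budget=200, seed=0):
    """Run a (1+1)-ES with fixed mutation step size sigma.

    Returns the best objective value found within the budget;
    this is the (noisy) performance measure the tuner sees.
    """
    rng = random.Random(seed)
    x = [rng.uniform(-5, 5) for _ in range(dim)]
    fx = sphere(x)
    for _ in range(budget):
        y = [xi + rng.gauss(0, sigma) for xi in x]
        fy = sphere(y)
        if fy < fx:  # elitist selection: keep the better point
            x, fx = y, fy
    return fx


def spo_sketch(lo=0.01, hi=2.0, rounds=4, candidates=5, repeats=3, seed=1):
    """Greatly simplified sequential parameter optimization loop.

    Each round samples candidate step sizes from the current design
    region, averages repeated runs to dampen noise, and then shrinks
    the region around the best setting seen so far.
    """
    rng = random.Random(seed)
    best_sigma, best_score = None, float("inf")
    for _ in range(rounds):
        sigmas = [rng.uniform(lo, hi) for _ in range(candidates)]
        for s in sigmas:
            # average over repeated runs: the target algorithm is stochastic
            score = sum(
                one_plus_one_es(s, seed=rng.randrange(10**6))
                for _ in range(repeats)
            ) / repeats
            if score < best_score:
                best_sigma, best_score = s, score
        # focus the next round's design region around the current best
        width = (hi - lo) / 2
        lo = max(0.001, best_sigma - width / 2)
        hi = best_sigma + width / 2
    return best_sigma, best_score
```

A full SPO run would replace the naive interval shrinking with a stochastic process (surrogate) model and a sequential design criterion, but the structure, repeated noisy evaluations plus sequential refinement, is the same.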