A study of multiobjective metaheuristics when solving parameter scalable problems

  • Authors:
  • Juan J. Durillo; Antonio J. Nebro; Carlos A. Coello Coello; José García-Nieto; Francisco Luna; Enrique Alba

  • Affiliations:
  • Departamento de Lenguajes y Ciencias de la Computación, University of Málaga, Málaga, Spain; Departamento de Lenguajes y Ciencias de la Computación, University of Málaga, Málaga, Spain; Department of Computer Science, Center of Research and Advanced Studies, Mexico DF, Mexico and Unité Mixte Internationale-Laboratoire Franco-Mexicaine d'Informatique et Automatique, CNRS, Ce ...; Departamento de Lenguajes y Ciencias de la Computación, University of Málaga, Málaga, Spain; Departamento de Lenguajes y Ciencias de la Computación, University of Málaga, Málaga, Spain; Departamento de Lenguajes y Ciencias de la Computación, University of Málaga, Málaga, Spain

  • Venue:
  • IEEE Transactions on Evolutionary Computation
  • Year:
  • 2010

Abstract

To evaluate the search capabilities of a multiobjective algorithm, the usual approach is to choose a benchmark of known problems, to perform a fixed number of function evaluations, and to apply a set of quality indicators. However, while real problems can have hundreds or even thousands of decision variables, current benchmarks are normally adopted with relatively few decision variables (typically from 10 to 30). Furthermore, performing a constant number of evaluations does not provide information about the effort required by an algorithm to obtain a satisfactory set of solutions; this information would also be of interest in real scenarios, where evaluating the functions that define the problem can be computationally expensive. In this paper, we study the effect of parameter scalability in a number of state-of-the-art multiobjective metaheuristics. We adopt a benchmark of parameter-wise scalable problems (the Zitzler-Deb-Thiele test suite) and analyze the behavior of eight multiobjective metaheuristics on these test problems when using a number of decision variables ranging from 8 up to 2048. By using the hypervolume indicator as a stopping condition, we also analyze the computational effort required by each algorithm to reach the Pareto front. We conclude that the two analyzed algorithms based on particle swarm optimization and differential evolution yield the best overall results.
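To make the described methodology concrete, below is a minimal Python sketch (not taken from the paper) of a parameter-wise scalable ZDT1 evaluation together with a simple two-objective hypervolume computation that could be used as a stopping condition. The reference point (1, 1), the 98% threshold, and the function names are illustrative assumptions, not the authors' exact experimental setup.

```python
import numpy as np

def zdt1(x):
    """ZDT1 objectives for a decision vector x of arbitrary length n.
    Each x[i] is assumed to lie in [0, 1]; the true Pareto front is f2 = 1 - sqrt(f1)."""
    n = len(x)
    f1 = x[0]
    g = 1.0 + 9.0 * np.sum(x[1:]) / (n - 1)
    f2 = g * (1.0 - np.sqrt(f1 / g))
    return np.array([f1, f2])

def hypervolume_2d(front, ref=(1.0, 1.0)):
    """Hypervolume of a two-objective non-dominated front w.r.t. a reference point
    (both objectives minimized). Points outside the reference point are ignored."""
    pts = np.asarray([p for p in front if p[0] < ref[0] and p[1] < ref[1]])
    if pts.size == 0:
        return 0.0
    pts = pts[np.argsort(pts[:, 0])]      # sort by f1 ascending
    hv, prev_f2 = 0.0, ref[1]
    for f1, f2 in pts:
        if f2 < prev_f2:                  # skip dominated points
            hv += (ref[0] - f1) * (prev_f2 - f2)
            prev_f2 = f2
    return hv

# Hypothetical stopping test: stop once the approximation front covers a given
# fraction of the hypervolume of the true ZDT1 front (with ref (1, 1), HV = 2/3).
def reached_front(front, fraction=0.98):
    return hypervolume_2d(front) >= fraction * (2.0 / 3.0)
```

Because ZDT1 scales in the number of decision variables, the same `zdt1` function can be evaluated with vectors of length 8 up to 2048, and `reached_front` gives one possible way to record the number of evaluations each metaheuristic needs before its front is considered close enough to the true Pareto front.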