Exploiting second order information in computational multi-objective evolutionary optimization

  • Authors:
  • Pradyumn Kumar Shukla

  • Affiliations:
  • Institute for Numerical Mathematics, Dresden University of Technology, Dresden, Germany

  • Venue:
  • EPIA'07: Proceedings of the 13th Portuguese Conference on Artificial Intelligence: Progress in Artificial Intelligence
  • Year:
  • 2007


Abstract

Evolutionary algorithms are efficient population-based methods for solving multi-objective optimization problems. Recently, various authors have discussed the efficacy of combining gradient-based classical methods with evolutionary algorithms, since gradient information leads to convergence to Pareto-optimal solutions at a linear rate. However, none of the existing studies has explored how to exploit second-order, i.e., Hessian, information in evolutionary multi-objective algorithms. Second-order information, though costly to obtain, yields quadratic convergence to Pareto-optimal solutions. In this paper, we take Levenberg-Marquardt methods from classical optimization and show two possible ways of constructing hybrid algorithms. These algorithms require gradient and Hessian information, which is obtained using finite-difference techniques. Computational studies on a number of test problems of varying complexity demonstrate the efficiency of the resulting hybrid algorithms in solving a large class of complex multi-objective optimization problems.
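To make the abstract's ingredients concrete, the following is a minimal sketch (not the paper's actual hybrid algorithms) of a Levenberg-Marquardt local-search step of the kind that could be embedded in an evolutionary algorithm: the gradient and Hessian of a scalarized objective are estimated by finite differences, and a damped Newton step is taken. The function names (`fd_gradient`, `fd_hessian`, `lm_step`), the damping parameter `lam`, and the use of a fixed weighted-sum scalarization are illustrative assumptions, not details from the paper.

```python
import numpy as np

def fd_gradient(f, x, h=1e-6):
    """Central finite-difference gradient of scalar function f at x."""
    g = np.zeros_like(x)
    for i in range(len(x)):
        e = np.zeros_like(x)
        e[i] = h
        g[i] = (f(x + e) - f(x - e)) / (2.0 * h)
    return g

def fd_hessian(f, x, h=1e-4):
    """Central finite-difference Hessian of scalar function f at x."""
    n = len(x)
    H = np.zeros((n, n))
    for i in range(n):
        for j in range(n):
            ei = np.zeros(n); ei[i] = h
            ej = np.zeros(n); ej[j] = h
            H[i, j] = (f(x + ei + ej) - f(x + ei - ej)
                       - f(x - ei + ej) + f(x - ei - ej)) / (4.0 * h * h)
    return 0.5 * (H + H.T)  # symmetrize against rounding error

def lm_step(f, x, lam=1e-2):
    """One Levenberg-Marquardt step: solve (H + lam*I) d = -g, return x + d.

    The damping lam interpolates between a Newton step (lam -> 0,
    quadratic local convergence) and a scaled gradient step (large lam).
    """
    g = fd_gradient(f, x)
    H = fd_hessian(f, x)
    d = np.linalg.solve(H + lam * np.eye(len(x)), -g)
    return x + d

# Illustrative multi-objective use: refine one individual of an EA
# population by applying LM steps to a weighted-sum scalarization.
def refine(objectives, weights, x, steps=5, lam=1e-3):
    scalarized = lambda z: float(sum(w * obj(z)
                                     for w, obj in zip(weights, objectives)))
    for _ in range(steps):
        x = lm_step(scalarized, x, lam)
    return x
```

In a hybrid scheme of the kind the abstract alludes to, such a step would typically be applied to selected population members between evolutionary generations; the cost of each step is dominated by the O(n^2) function evaluations needed for the finite-difference Hessian.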