Probabilistic modeling for continuous EDA with Boltzmann selection and Kullback-Leibler divergence

  • Authors:
  • Cai Yunpeng; Sun Xiaomin; Jia Peifa

  • Affiliations:
  • Tsinghua University, Beijing, P.R. China; Tsinghua University, Beijing, P.R. China; Tsinghua University, Beijing, P.R. China

  • Venue:
  • Proceedings of the 8th Annual Conference on Genetic and Evolutionary Computation (GECCO '06)
  • Year:
  • 2006

Abstract

This paper extends Boltzmann selection, a method of theoretical importance in EDA, from the discrete domain to the continuous one. The difficulty of estimating the exact Boltzmann distribution in a continuous state space is circumvented by adopting the multivariate Gaussian model, which is popular in continuous EDA, to approximate only the final sampling distribution. Using the minimum Kullback-Leibler divergence principle, both the mean vector and the covariance matrix of the Gaussian model can be calibrated to preserve the features of Boltzmann selection that reflect the desired selection pressure. A method is also proposed to adapt the selection pressure by measuring the success of the past evolution process. This work establishes a formal basis for building probabilistic models with adaptive parameters in continuous EDA algorithms. The framework is incorporated into both the continuous UMDA and the EMNA algorithm and tested on several benchmark problems. The experimental results are compared with some existing EDA variants, and the benefit of the proposed approach is discussed.
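One common reading of the calibration step described above is that minimizing the KL divergence from the (empirical) Boltzmann distribution to a Gaussian reduces to moment matching with weights proportional to exp(beta * fitness). The sketch below illustrates that interpretation only; it is not the authors' code, and the function name, the inverse-temperature parameter `beta`, and the toy problem are assumptions introduced here for illustration.

```python
# Minimal sketch (assumed weighted-moment-matching interpretation, not the paper's code):
# fit a Gaussian to a population reweighted by Boltzmann weights w_i ∝ exp(beta * f(x_i)).
import numpy as np

def boltzmann_gaussian_fit(samples, fitness, beta, full_covariance=True):
    """Fit a Gaussian to Boltzmann-weighted samples.

    samples         : (n, d) array of candidate solutions
    fitness         : (n,) array of fitness values (maximization assumed)
    beta            : selection pressure (inverse temperature); larger = greedier
    full_covariance : True  -> full covariance (EMNA-style model)
                      False -> diagonal covariance (UMDAc-style, independent variables)
    """
    f = np.asarray(fitness, dtype=float)
    # Subtract the maximum before exponentiating for numerical stability.
    w = np.exp(beta * (f - f.max()))
    w /= w.sum()

    # Weighted mean (first moment of the reweighted population).
    mu = w @ samples

    # Weighted covariance (second central moment of the reweighted population).
    centered = samples - mu
    if full_covariance:
        cov = (w[:, None] * centered).T @ centered
    else:
        cov = np.diag(w @ (centered ** 2))
    return mu, cov

# Toy usage: one iteration of a continuous EDA loop on a sphere function
# (population size, beta, and problem are arbitrary illustrative choices).
rng = np.random.default_rng(0)
pop = rng.normal(size=(200, 5))
fit = -np.sum(pop ** 2, axis=1)                       # maximize -||x||^2
mu, cov = boltzmann_gaussian_fit(pop, fit, beta=1.0)
next_pop = rng.multivariate_normal(mu, cov, size=200) # sample the next generation
```

Under this reading, increasing `beta` concentrates the weights on the best individuals (higher selection pressure), while `beta -> 0` recovers the unweighted sample mean and covariance; the paper's adaptive scheme adjusts this pressure based on how successful past generations were.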