Diversity loss in general estimation of distribution algorithms

  • Authors:
  • Jonathan L. Shapiro

  • Affiliations:
  • University of Manchester, Manchester, UK

  • Venue:
  • PPSN'06: Proceedings of the 9th International Conference on Parallel Problem Solving from Nature
  • Year:
  • 2006

Abstract

A very general class of estimation of distribution algorithms (EDAs) is defined, for which universal results on the rate of diversity loss can be derived. This class, denoted SML-EDA, requires two restrictions: 1) in each generation, the new probability model is built using only data sampled from the current probability model; and 2) maximum likelihood is used to set the model parameters. The class is very general; it includes simple forms of many well-known EDAs, e.g. BOA, MIMIC, FDA and UMDA. To study diversity loss in SML-EDAs, the trace of the empirical covariance matrix is proposed as the statistic of interest. Two simple results are derived. Let N be the number of data vectors evaluated in each generation. It is shown that on a flat landscape, the expected value of the statistic decreases by a factor of 1 - 1/N in each generation. This result is used to show that on the Needle problem, the algorithm will, with high probability, never find the optimum unless the population size grows exponentially in the number of search variables.
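
As an illustration of the flat-landscape result, the following is a minimal sketch (not the paper's code) of a UMDA-style binary member of the SML-EDA class: each generation samples N vectors from the current product-of-marginals model, refits the marginals by maximum likelihood, and records the trace of the empirical covariance matrix. The function name, parameter values, and the choice of UMDA as the representative algorithm are illustrative assumptions; the simulated trace should track the predicted (1 - 1/N)^t decay.

import numpy as np

# Minimal sketch of a UMDA-style binary member of the SML-EDA class on a
# flat landscape (illustrative; not the paper's code). With a flat fitness
# function there is no selection, so each generation simply resamples N
# vectors from the current marginals and refits them by maximum likelihood.
# The tracked statistic is the trace of the empirical covariance matrix,
# which for binary variables equals sum_i f_i * (1 - f_i), where f_i is the
# empirical frequency of variable i in the sampled population.

rng = np.random.default_rng(0)

def flat_landscape_trace(n_vars=50, N=20, generations=30, trials=500):
    """Average the covariance trace over independent runs, per generation."""
    traces = np.zeros(generations)
    for _ in range(trials):
        p = np.full(n_vars, 0.5)                             # initial marginals
        for t in range(generations):
            X = (rng.random((N, n_vars)) < p).astype(float)  # sample N vectors
            p = X.mean(axis=0)                               # maximum-likelihood refit
            traces[t] += np.sum(p * (1 - p))                 # trace of empirical covariance
    return traces / trials

n_vars, N = 50, 20
v = flat_landscape_trace(n_vars=n_vars, N=N)
predicted = n_vars * 0.25 * (1 - 1 / N) ** np.arange(1, len(v) + 1)
for t in (0, 4, 9, 19, 29):
    print(f"generation {t + 1:2d}: simulated {v[t]:.3f}, predicted {predicted[t]:.3f}")

The (1 - 1/N) factor in this binary setting comes from the finite-sample variance of the maximum-likelihood estimate: if f is the empirical frequency of N Bernoulli(p) samples, then E[f(1 - f)] = p(1 - p)(1 - 1/N), so the expected trace contracts by that factor every generation regardless of the current model.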