Compact representations as a search strategy: compression EDAs

  • Author: Marc Toussaint
  • Affiliation: Institute for Adaptive and Neural Computation, University of Edinburgh, Edinburgh, Scotland, UK
  • Venue: Theoretical Computer Science - Foundations of Genetic Algorithms
  • Year: 2006

Abstract

The choice of representation crucially determines the capability of search processes to find complex solutions in which many variables interact. The question is how good representations can be found and how they can be adapted online to account for what can be learned about the structure of the problem from previous samples. We address these questions in a scenario that we term indirect Estimation-of-Distribution: we consider a decorrelated search distribution (mutational variability) on a variable-length genotype space. A one-to-one encoding onto the phenotype space then needs to induce an adapted phenotypic search distribution that incorporates the dependencies between phenotypic variables observed in previously successful samples. Formalizing this in the framework of Estimation-of-Distribution Algorithms, an adapted phenotypic search distribution can be characterized as one minimizing the Kullback-Leibler divergence (KLD) to a population of previously selected samples (parents). The paper derives a relation between this KLD and the description length of the encoding, showing that compact representations provide a way to minimize the divergence. A proposed class of Compression Evolutionary Algorithms and experiments with a grammar-based compression scheme illustrate the new concept.
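
The sketch below illustrates the compression-EDA idea described in the abstract, under assumptions not specified there: a toy bit-string objective, a Re-Pair-style digram substitution standing in for the grammar-based compression scheme, and simple symbol-wise mutation as the decorrelated genotypic search distribution. The names `learn_grammar`, `encode`, `decode`, and `mutate` are hypothetical and chosen for illustration; the point is that mutating in the compact genotype space and decoding back induces correlated variation among phenotypic variables, the mechanism the paper formalizes via the KLD to the parent population.

```python
# Minimal sketch (not the paper's implementation): a compression EDA on a toy
# bit-string problem. A Re-Pair-style digram substitution stands in for the
# grammar-based compressor; mutation acts symbol-wise (decorrelated) on the
# compact genotype and is decoded back to the phenotype.

import random
from collections import Counter


def replace_pair(seq, pair, new_sym):
    """Replace non-overlapping occurrences of an adjacent symbol pair."""
    out, i = [], 0
    while i < len(seq):
        if i + 1 < len(seq) and (seq[i], seq[i + 1]) == pair:
            out.append(new_sym)
            i += 2
        else:
            out.append(seq[i])
            i += 1
    return out


def learn_grammar(sequences, rounds=3):
    """Greedily introduce rules for the most frequent digrams (a hypothetical
    stand-in for the grammar-based compression scheme)."""
    rules, next_sym = {}, 1000  # fresh symbol ids; base symbols are 0/1
    seqs = [list(s) for s in sequences]
    for _ in range(rounds):
        pairs = Counter()
        for s in seqs:
            pairs.update(zip(s, s[1:]))
        if not pairs:
            break
        (a, b), count = pairs.most_common(1)[0]
        if count < 2:
            break
        rules[next_sym] = (a, b)
        seqs = [replace_pair(s, (a, b), next_sym) for s in seqs]
        next_sym += 1
    return rules


def encode(pheno, rules):
    """Compress a phenotype into the genotype alphabet (rules in learned order)."""
    s = list(pheno)
    for sym, pair in rules.items():
        s = replace_pair(s, pair, sym)
    return s


def decode(geno, rules):
    """Expand genotype symbols back into the phenotype (unfold rules recursively)."""
    out = []
    for sym in geno:
        if sym in rules:
            out.extend(decode(rules[sym], rules))
        else:
            out.append(sym)
    return out


def mutate(geno, alphabet, rate=0.1):
    """Decorrelated, symbol-wise mutation on the compact genotype."""
    return [random.choice(alphabet) if random.random() < rate else g for g in geno]


def fitness(pheno, target_len=20):
    # Toy objective: many ones, phenotype length close to target_len.
    return sum(pheno) - abs(len(pheno) - target_len)


random.seed(0)
pop_size, n_parents, length = 40, 10, 20
population = [[random.randint(0, 1) for _ in range(length)] for _ in range(pop_size)]

for gen in range(30):
    parents = sorted(population, key=fitness, reverse=True)[:n_parents]
    rules = learn_grammar(parents)             # adapt the representation to the parents
    alphabet = [0, 1] + list(rules.keys())
    offspring = []
    while len(offspring) < pop_size:
        geno = mutate(encode(random.choice(parents), rules), alphabet)
        offspring.append(decode(geno, rules))  # correlated phenotypic variation
    population = offspring

print("best fitness:", max(fitness(p) for p in population))
```

Because a single mutated genotype symbol can expand into a whole substring under the learned rules, frequently co-occurring phenotypic building blocks of the parents are varied jointly rather than independently, which is the intended effect of adapting a compact representation online.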