Levels of abstraction in modeling and sampling: the feature-based Bayesian optimization algorithm

  • Authors: Moshe Looks
  • Affiliation: Washington University in St. Louis, St. Louis, MO
  • Venue: Proceedings of the 8th annual conference on Genetic and evolutionary computation
  • Year: 2006

Abstract

I introduce a generalization of probabilistic modeling and sampling for estimation of distribution algorithms (EDAs) that allows models to contain features: additional levels of abstraction defined in terms of the problem's base-level variables. I demonstrate how a simple feature class, variable-position motifs within fixed-length strings, may be exploited by a powerful EDA, the Bayesian optimization algorithm (BOA). Experimental results are presented where motifs are learned autonomously via a simple heuristic. The effectiveness of this feature-based BOA is demonstrated across a range of problems where such motifs are relevant.
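To make the abstract's central idea concrete, here is a minimal sketch of what a "variable-position motif" feature over fixed-length strings might look like: a derived binary variable that fires when a motif occurs at any position, appended alongside the base-level variables so a model such as BOA could condition on it. All names and details below are illustrative assumptions, not the paper's actual implementation.

```python
# Hypothetical sketch (not the paper's code): a variable-position motif is a
# feature that takes value 1 when the motif substring occurs at ANY position
# in a fixed-length string, abstracting over its exact location.

def motif_feature(motif: str, s: str) -> int:
    """Return 1 if `motif` appears anywhere in the fixed-length string `s`."""
    return 1 if motif in s else 0

def augment_with_features(population, motifs):
    """Pair each individual's base-level string with its motif feature values,
    giving a model an additional level of abstraction to condition on."""
    return [(s, [motif_feature(m, s) for m in motifs]) for s in population]

population = ["0110", "1011", "0000"]
motifs = ["11", "00"]
print(augment_with_features(population, motifs))
# → [('0110', [1, 0]), ('1011', [1, 0]), ('0000', [0, 1])]
```

The key property this illustrates is position invariance: "0110" and "1011" contain the motif "11" at different offsets yet receive the same feature value, which is what lets the model capture regularities that base-level (per-position) variables would miss.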