Using previous models to bias structural learning in the hierarchical BOA

  • Authors:
  • Mark W. Hauschild; Martin Pelikan; Kumara Sastry; David E. Goldberg

  • Affiliations:
  • University of Missouri - St. Louis, St. Louis, MO, USA; University of Missouri - St. Louis, St. Louis, MO, USA; University of Illinois at Urbana-Champaign, Urbana-Champaign, IL, USA; University of Illinois at Urbana-Champaign, Urbana-Champaign, IL, USA

  • Venue:
  • Proceedings of the 10th Annual Conference on Genetic and Evolutionary Computation
  • Year:
  • 2008

Abstract

Estimation of distribution algorithms (EDAs) are stochastic optimization techniques that explore the space of potential solutions by building and sampling probabilistic models of promising candidate solutions. While the primary goal of applying EDAs is to discover the global optimum (or an accurate approximation), any EDA also provides us with a sequence of probabilistic models, which hold a great deal of information about the problem. Although using problem-specific knowledge has been shown to significantly improve performance of EDAs and other evolutionary algorithms, this readily available source of information has been largely ignored by the EDA community. This paper takes the first step towards the use of probabilistic models obtained by EDAs to speed up the solution of similar problems in the future. More specifically, we propose two approaches to biasing model building in the hierarchical Bayesian optimization algorithm (hBOA) based on knowledge automatically learned from previous runs on similar problems. We show that the methods lead to substantial speedups and argue that they should work well in other applications that require solving a large number of problems with similar structure.
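To make the build-and-sample loop the abstract describes concrete, here is a minimal univariate EDA sketch (UMDA-style) on the OneMax problem. This is an illustration of the general EDA idea only, not the paper's hBOA, which learns a Bayesian network with local structures rather than treating each bit independently; all function and parameter names here are illustrative.

```python
import random

def umda(fitness, n_bits=20, pop_size=100, select=50, generations=30, seed=0):
    """Sketch of a univariate EDA: repeatedly sample a population from a
    probabilistic model, select promising solutions, and rebuild the model
    from them. hBOA replaces the independent per-bit model with a learned
    Bayesian network over the variables."""
    rng = random.Random(seed)
    # Start from a uniform model: each bit is 1 with probability 0.5.
    probs = [0.5] * n_bits
    best = None
    for _ in range(generations):
        # Sample candidate solutions from the current model.
        pop = [[1 if rng.random() < p else 0 for p in probs]
               for _ in range(pop_size)]
        pop.sort(key=fitness, reverse=True)
        if best is None or fitness(pop[0]) > fitness(best):
            best = pop[0]
        # Re-estimate the model from the selected (promising) solutions.
        selected = pop[:select]
        probs = [sum(ind[i] for ind in selected) / select
                 for i in range(n_bits)]
    return best

# OneMax: fitness is simply the number of ones in the bit string.
best = umda(sum, n_bits=20)
```

The sequence of `probs` vectors produced across generations is exactly the kind of model trace the paper proposes to mine: statistics from models learned on earlier runs can bias model building on similar problems later.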