The effects of training set size and keeping rules on the emergent selection pressure of learnable evolution model

  • Authors:
  • Mark Coletti

  • Affiliations:
  • George Mason University, Washington, USA

  • Venue:
  • Proceedings of the 14th annual conference companion on Genetic and evolutionary computation
  • Year:
  • 2012


Abstract

Evolutionary algorithms with computationally expensive fitness evaluations typically must work with smaller evaluation budgets and population sizes. However, smaller populations and fewer evaluations mean that the problem space may not be explored effectively. An evolutionary algorithm may be combined with a machine learner to compensate for these smaller populations and evaluation budgets and so increase the likelihood of finding viable solutions. The Learnable Evolution Model (LEM) is such an evolutionary algorithm (EA) and machine learner (ML) hybrid: it infers rules from the best- and least-fit individuals and then exploits those rules when creating offspring. This paper shows that LEM introduces a unique form of emergent selection pressure that is separate from any selection pressure induced by parent or survivor selection. Additionally, this work shows that this selection pressure can be attenuated by how the best- and least-fit subsets are chosen and by how long learned rules are kept. Practitioners need to be aware of this novel form of selection pressure and of these means of adjusting it to ensure their LEM implementations are adequately tuned: too much selection pressure may cause premature convergence to inferior solutions, while too little may mean that no satisfactory solutions are found at all.
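To make the abstract's description concrete, the following is a minimal, hypothetical sketch of a LEM-style generational step: rank the population, take best- and least-fit fractions (the subset-selection knob the paper identifies as one way to attenuate the emergent selection pressure), "learn" rules from the best subset, and create offspring by instantiating those rules. The interval-based learner here is a stand-in assumption for illustration only; actual LEM implementations use attributional rule induction (e.g., AQ-family learners), and all function names and parameters below are invented, not from the paper.

```python
import random

def fitness(x):
    # Hypothetical fitness: the sphere function (minimization); not from the paper.
    return sum(v * v for v in x)

def select_subsets(pop, frac=0.3):
    # Rank by fitness and take the best/least-fit fractions. The paper shows
    # that how these subsets are chosen attenuates the emergent selection pressure.
    ranked = sorted(pop, key=fitness)
    k = max(1, int(len(pop) * frac))
    return ranked[:k], ranked[-k:]  # (best-fit subset, least-fit subset)

def learn_rules(best):
    # Toy "machine learner": per-dimension interval rules that cover the
    # best-fit subset. Real LEM induces attributional rules discriminating
    # the best-fit subset from the least-fit one.
    dims = len(best[0])
    return [(min(ind[d] for ind in best), max(ind[d] for ind in best))
            for d in range(dims)]

def instantiate(rules, n):
    # Offspring are sampled uniformly inside the learned intervals, so rule
    # instantiation itself drives the population toward the best-fit region --
    # a selection pressure separate from parent or survivor selection.
    return [[random.uniform(lo, hi) for lo, hi in rules] for _ in range(n)]

def lem_step(pop, frac=0.3):
    # One generational step; rules here are re-learned every generation.
    # How long rules are *kept* across generations is the second knob the
    # paper identifies for adjusting the emergent selection pressure.
    best, _least = select_subsets(pop, frac)
    rules = learn_rules(best)
    return instantiate(rules, len(pop))

# Deterministic illustration of rule induction on a tiny best-fit subset:
print(learn_rules([[1.0, 2.0], [3.0, 4.0], [2.0, 3.0]]))  # -> [(1.0, 3.0), (2.0, 4.0)]
```

Note that because offspring are confined to the region covered by the learned rules, aggressive choices (a very small best-fit fraction, or rules kept for many generations) tighten that region quickly, which is exactly the premature-convergence risk the abstract warns about.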