Extending Selection Learning toward Fixed-Length d-Ary Strings

  • Authors:
  • Arnaud Berny

  • Venue:
  • Selected Papers from the 5th European Conference on Artificial Evolution
  • Year:
  • 2001

Abstract

The aim of this paper is to extend selection learning, initially designed for the optimization of real functions over fixed-length binary strings, to fixed-length strings over an arbitrary finite alphabet. We derive selection learning algorithms from clear principles. First, we search for product probability measures over d-ary strings, or equivalently, random variables whose components are statistically independent. Second, these distributions are evaluated relative to the expectation of the fitness function; more precisely, we consider the logarithm of the expectation in order to introduce fitness-proportional and Boltzmann selection. Third, we define two kinds of gradient systems to maximize the expectation: the first drives unbounded parameters, whereas the second directly drives probabilities, à la PBIL. We also introduce composite selection, that is, algorithms which take into account positively as well as negatively selected strings. We propose stochastic approximations of the gradient systems and, finally, apply three of the resulting algorithms to two test functions, OneMax and BigJump, and draw conclusions on their relative strengths and weaknesses.
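The PBIL-style variant described above can be illustrated with a minimal sketch. This is not the paper's algorithm, only an assumed implementation of the general idea: maintain one categorical distribution per string position (a product measure over d-ary strings), sample a population, and move the probabilities toward the empirical frequencies of the positively selected strings. All parameter names and values (`lr`, `mu`, the choice of OneMax target symbol) are illustrative assumptions.

```python
import numpy as np

def onemax(x):
    # OneMax variant: count positions equal to a target symbol (here 0).
    return np.sum(x == 0)

def pbil_dary(fitness, n=20, d=3, pop_size=50, mu=5, lr=0.1, iters=200, seed=0):
    """PBIL-style selection learning over fixed-length d-ary strings.

    probs[i, k] is the probability of symbol k at position i, so each
    row is a categorical distribution and the whole array is a product
    probability measure over {0, ..., d-1}^n.
    """
    rng = np.random.default_rng(seed)
    probs = np.full((n, d), 1.0 / d)  # start from the uniform distribution
    for _ in range(iters):
        # Sample pop_size strings from the product distribution.
        pop = np.array([[rng.choice(d, p=probs[i]) for i in range(n)]
                        for _ in range(pop_size)])
        scores = np.array([fitness(x) for x in pop])
        best = pop[np.argsort(scores)[-mu:]]  # positively selected strings
        # Empirical symbol frequencies among the selected strings.
        freq = np.zeros((n, d))
        for x in best:
            freq[np.arange(n), x] += 1.0
        freq /= mu
        # PBIL-style update: convex step toward the selected frequencies;
        # each row of probs remains a normalized distribution.
        probs = (1 - lr) * probs + lr * freq
    return probs

probs = pbil_dary(onemax)
```

On this fitness function the probability mass at each position should concentrate on the target symbol; composite selection, as described in the abstract, would additionally push probabilities away from the frequencies of the worst-scoring strings.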