An extension of hill-climbing with learning applied to a symbolic regression of boolean functions

  • Authors:
  • Vladimír Kvasnička; Ladislav Clementis; Jiří Pospíchal

  • Affiliations:
  • Slovak University of Technology, Bratislava, Slovakia; Slovak University of Technology, Bratislava, Slovakia; Slovak University of Technology, Bratislava, Slovakia

  • Venue:
  • Proceedings of the 15th annual conference companion on Genetic and evolutionary computation
  • Year:
  • 2013

Abstract

In this paper we discuss an application of a simple stochastic optimization algorithm called hill climbing with learning (HCwL) to the study of symbolic regression. A fundamental role in this approach is played by the so-called probability vector w = (w1, w2, ..., wn), where an entry 0 ≤ wi ≤ 1 specifies the probability that the i-th component of a solution (e.g., a bit in a binary representation) takes the value 1. An integral part of HCwL is a mutation process in which a new solution xnew is created from the current solution xold by stochastic mutation. The probability vector w (considered here as a special type of collective memory) serves as an auxiliary device for constructing the new mutated solution xnew; in particular, it indicates promising directions for its creation, directions that are specified by the previous history of the adaptation process.
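To make the idea concrete, the following is a minimal sketch of an HCwL-style loop over binary strings: the probability vector w biases which value a mutated bit takes, and w is nudged toward each accepted solution. The parameter names, the mutation rule, and the learning-rate update are illustrative assumptions for this sketch, not the exact scheme from the paper.

```python
import random

def hcwl(fitness, n, iters=1000, p_mut=0.1, lr=0.05):
    """Sketch of hill climbing with learning (HCwL).

    w[i] is the learned probability that bit i should be 1; it biases
    the mutation of the current solution and is moved toward each
    accepted (improving) solution, acting as a collective memory of
    the adaptation history.  Details here are assumptions.
    """
    w = [0.5] * n                                    # neutral initial memory
    x_old = [random.randint(0, 1) for _ in range(n)]
    f_old = fitness(x_old)

    for _ in range(iters):
        # Mutation guided by w: each bit selected for mutation is
        # resampled from w rather than flipped blindly, so past
        # successes bias the search toward promising directions.
        x_new = [(1 if random.random() < w[i] else 0)
                 if random.random() < p_mut else x_old[i]
                 for i in range(n)]
        f_new = fitness(x_new)

        if f_new >= f_old:                           # greedy hill-climbing acceptance
            x_old, f_old = x_new, f_new
            # Learning step: shift w slightly toward the accepted solution.
            w = [(1 - lr) * w[i] + lr * x_old[i] for i in range(n)]

    return x_old, f_old

# Toy usage: maximize the number of 1s in a 20-bit string.
best, score = hcwl(lambda x: sum(x), n=20)
```

In a symbolic-regression setting the binary string would encode a candidate Boolean expression and the fitness would measure its agreement with the target function; the toy objective above only illustrates the mechanics of the loop.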