Effects of constant optimization by nonlinear least squares minimization in symbolic regression

  • Authors:
  • Michael Kommenda, Gabriel Kronberger, Stephan Winkler, Michael Affenzeller, Stefan Wagner

  • Affiliations:
  • University of Applied Sciences Upper Austria, Hagenberg, Austria (all authors)

  • Venue:
  • Proceedings of the 15th Annual Conference Companion on Genetic and Evolutionary Computation
  • Year:
  • 2013


Abstract

In this publication, a constant optimization approach for symbolic regression is introduced that separates the task of finding the correct model structure from the necessity of evolving the correct numerical constants. A gradient-based nonlinear least squares optimization algorithm, the Levenberg-Marquardt (LM) algorithm, is used to adjust constant values in symbolic expression trees during their evolution. The LM algorithm depends on gradient information, consisting of the partial derivatives of the trees, which is obtained by automatic differentiation. The presented constant optimization approach is tested on several benchmark problems and compared to a standard genetic programming algorithm to demonstrate its effectiveness. Although constant optimization incurs an overhead in execution time, it significantly increases both the achieved accuracy and the ability of genetic programming to learn from the provided data. As an example, the Pagie-1 problem could be solved in 37 out of 50 test runs, whereas without constant optimization it was solved in only 10 runs. Furthermore, different configurations of the constant optimization approach (number of iterations, probability of applying constant optimization) are evaluated, and their impact is detailed in the results section.
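To illustrate the idea described in the abstract, the sketch below fits only the numeric constants of a fixed model structure with Levenberg-Marquardt, leaving the structure itself untouched. This is not the authors' implementation: the model f(x) = c0·sin(c1·x) + c2, the data, and all names are illustrative assumptions, and the Jacobian is written out by hand for brevity, whereas the paper derives gradients of arbitrary expression trees by automatic differentiation.

```python
# Minimal sketch of LM-based constant optimization for one candidate
# structure, assuming the hypothetical model f(x) = c0*sin(c1*x) + c2.
import numpy as np
from scipy.optimize import least_squares

def model(c, x):
    # Candidate structure (e.g., found by GP); only the constants c are tuned.
    return c[0] * np.sin(c[1] * x) + c[2]

def residuals(c, x, y):
    return model(c, x) - y

def jacobian(c, x, y):
    # Partial derivatives of each residual w.r.t. c0, c1, c2.
    # (In the paper, these come from automatic differentiation of the tree.)
    return np.column_stack([
        np.sin(c[1] * x),             # d/dc0
        c[0] * x * np.cos(c[1] * x),  # d/dc1
        np.ones_like(x),              # d/dc2
    ])

# Synthetic training data generated from "true" constants (2.5, 1.3, -0.7).
rng = np.random.default_rng(0)
x = np.linspace(-3, 3, 100)
y = 2.5 * np.sin(1.3 * x) - 0.7 + rng.normal(0.0, 0.05, x.size)

# method='lm' selects Levenberg-Marquardt; max_nfev caps the evaluations,
# mirroring the iteration-count parameter evaluated in the paper.
fit = least_squares(residuals, x0=[1.0, 1.0, 0.0], jac=jacobian,
                    args=(x, y), method='lm', max_nfev=20)
print(fit.x)  # constants after a few LM steps, close to (2.5, 1.3, -0.7)
```

In a GP setting, a step like this would be applied to each (or a probabilistically chosen subset of) expression trees per generation, so the evolutionary search only needs to discover the model structure while LM supplies well-fitted constants.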