Evolving artificial neural networks for nonlinear feature construction

  • Authors:
  • Tobias Berka; Helmut A. Mayer

  • Affiliations:
  • University of Cambridge & University of Salzburg, Cambridge, United Kingdom; University of Salzburg, Salzburg, Austria

  • Venue:
  • Proceedings of the 15th Annual Conference on Genetic and Evolutionary Computation (GECCO '13)
  • Year:
  • 2013

Abstract

We use neuroevolution to construct nonlinear transformation functions for feature construction that map points in the original feature space to augmented pattern vectors and improve the performance of generic classifiers. Our research demonstrates that evolutionary algorithms can be applied both to adapt the weights of a fully connected standard multi-layer perceptron (MLP) and to optimize the topology of a generalized multi-layer perceptron (GMLP). The evaluation of the MLPs on four commonly used data sets shows an improvement in classification accuracy ranging from 4 to 13 percentage points over the performance on the original pattern set. The GMLPs achieve slightly better accuracy and conserve 14% to 54% of all neurons and 40% to 89% of all connections compared to the standard MLP.
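
To make the pipeline concrete, below is a minimal sketch of the weight-evolution variant described in the abstract: a fixed-topology MLP whose weights are adapted by a simple (mu + lambda) evolution strategy, with fitness measured as the cross-validated accuracy of a generic classifier on the augmented pattern vectors (original features concatenated with the MLP outputs). The layer sizes, the k-NN classifier, the Iris data set, and all evolution-strategy parameters are illustrative assumptions, not details taken from the paper.

```python
# Sketch of neuroevolutionary feature construction: evolve MLP weights so that
# the augmented patterns [X, MLP(X)] improve a generic classifier's accuracy.
import numpy as np
from sklearn.datasets import load_iris
from sklearn.model_selection import cross_val_score
from sklearn.neighbors import KNeighborsClassifier

rng = np.random.default_rng(0)
X, y = load_iris(return_X_y=True)
n_in, n_hidden, n_out = X.shape[1], 6, 3          # assumed layer sizes
n_weights = n_in * n_hidden + n_hidden * n_out    # no biases, for brevity

def transform(w, X):
    """Map original patterns through the evolved MLP (tanh activations)."""
    W1 = w[: n_in * n_hidden].reshape(n_in, n_hidden)
    W2 = w[n_in * n_hidden:].reshape(n_hidden, n_out)
    return np.tanh(np.tanh(X @ W1) @ W2)

def fitness(w):
    """Cross-validated accuracy of a generic classifier on augmented patterns."""
    X_aug = np.hstack([X, transform(w, X)])
    clf = KNeighborsClassifier(n_neighbors=5)
    return cross_val_score(clf, X_aug, y, cv=5).mean()

# Simple (mu + lambda) evolution strategy over the flattened weight vector.
mu, lam, sigma, generations = 5, 20, 0.3, 50
parents = [rng.normal(0, 1, n_weights) for _ in range(mu)]
for g in range(generations):
    offspring = [p + rng.normal(0, sigma, n_weights)
                 for p in parents for _ in range(lam // mu)]
    pool = parents + offspring
    pool.sort(key=fitness, reverse=True)   # keep the fittest weight vectors
    parents = pool[:mu]

best = parents[0]
baseline = cross_val_score(KNeighborsClassifier(5), X, y, cv=5).mean()
print(f"baseline accuracy:  {baseline:.3f}")
print(f"augmented accuracy: {fitness(best):.3f}")
```

Evolving the GMLP topology would additionally encode which neurons and connections exist (e.g., a connectivity mask evolved alongside the weights); the sketch above covers only the fixed-topology weight-adaptation case.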