Effects of speciation on evolution of neural networks in highly dynamic environments

  • Authors:
  • Peter Krčah

  • Affiliations:
  • Computer Center, Charles University, Prague 1, Czech Republic

  • Venue:
  • LION'12: Proceedings of the 6th International Conference on Learning and Intelligent Optimization
  • Year:
  • 2012

Abstract

Using genetic algorithms to solve dynamic optimization problems is an important area of current research. In this work, we investigate the effects of speciation in NeuroEvolution of Augmenting Topologies (NEAT), a well-known method for evolving neural network topologies, on problems with a dynamic fitness function. NEAT uses speciation to maintain diversity in the population and to protect new solutions from competition. We show that NEAT outperforms a non-speciated genetic algorithm (GA) not only on problems with a static fitness function, but also on problems with a gradually moving optimum. We also demonstrate that NEAT fails to achieve better performance on problems where the optimum moves rapidly. We propose a novel method called DynNEAT, which extends NEAT by changing the size of each species based on its historical performance. We demonstrate that DynNEAT outperforms both NEAT and the non-speciated GA on problems with a rapidly moving optimum, while achieving performance similar to NEAT on problems with a static or slowly moving optimum.
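The abstract does not spell out how DynNEAT resizes species from their historical performance. The sketch below is a hypothetical Python illustration of the general idea: each species keeps a record of its past mean fitness, and offspring slots for the next generation are allocated in proportion to an exponentially smoothed history of that record. The class and function names (`Species`, `allocate_offspring`), the smoothing factor `alpha`, and the fitness-proportional allocation rule are assumptions made for illustration, not the authors' actual DynNEAT formula.

```python
import random
from dataclasses import dataclass, field


@dataclass
class Species:
    """A simplified NEAT species; members are represented only by their fitness values."""
    members: list = field(default_factory=list)
    fitness_history: list = field(default_factory=list)

    def record_fitness(self):
        # Store the current mean fitness of the species (one entry per generation).
        if self.members:
            self.fitness_history.append(sum(self.members) / len(self.members))

    def historical_fitness(self, alpha=0.5):
        # Exponentially smoothed average over past generations (assumed scheme,
        # not necessarily the one used in DynNEAT).
        value = 0.0
        for f in self.fitness_history:
            value = alpha * f + (1 - alpha) * value
        return value


def allocate_offspring(species_list, population_size, alpha=0.5):
    """Distribute offspring slots among species proportionally to historical performance."""
    scores = [max(s.historical_fitness(alpha), 1e-9) for s in species_list]
    total = sum(scores)
    # Fitness-proportional allocation; leftover slots from rounding go to the best species.
    shares = [int(population_size * sc / total) for sc in scores]
    while sum(shares) < population_size:
        shares[scores.index(max(scores))] += 1
    return shares


if __name__ == "__main__":
    random.seed(0)
    # Two toy species with different fitness levels.
    strong = Species(members=[random.random() + 0.5 for _ in range(10)])
    weak = Species(members=[random.random() * 0.3 for _ in range(10)])
    for _ in range(5):
        strong.record_fitness()
        weak.record_fitness()
    print(allocate_offspring([strong, weak], population_size=100))
```

In this toy setup the historically stronger species receives most of the offspring slots, which captures the intuition of growing well-performing species and shrinking stagnant ones; the paper's own resizing rule may weight history differently.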