Adding learning to the cellular development of neural networks: Evolution and the Baldwin effect

  • Authors:
  • Frédéric Gruau; Darrell Whitley

  • Affiliations:
  • Département de Recherche Fondamentale, Matière Condensée, CEA CENG, SP2M PSC BP 85X, 38041 Grenoble, France (gruau@drfmc.ceng.cea.fr); Computer Science Department, Colorado State University, Fort Collins, CO 80523 (whitley@cs.colostate.edu)

  • Venue:
  • Evolutionary Computation
  • Year:
  • 1993

Abstract

A grammar tree is used to encode a cellular developmental process that can generate whole families of Boolean neural networks for computing parity and symmetry. The development process resembles biological cell division. A genetic algorithm is used to find a grammar tree that yields both architecture and weights specifying a particular neural network for solving specific Boolean functions. The current study particularly focuses on the addition of learning to the development process and the evolution of grammar trees. Three ways of adding learning to the development process are explored. Two of these exploit the Baldwin effect by changing the fitness landscape without using Lamarckian evolution. The third strategy is Lamarckian in nature. Results for these three modes of combining learning with genetic search are compared against genetic search without learning. Our results suggest that merely using learning to change the fitness landscape can be as effective as Lamarckian strategies at improving search.
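The distinction between the three modes can be sketched with a minimal genetic algorithm. In the Baldwinian modes, learning (here a greedy bit-flip hill climb) changes only the fitness assigned to an individual; in the Lamarckian mode, the learned result is also written back into the genotype. Everything below is illustrative: a one-max bitstring objective stands in for network performance, and all function names and parameters are assumptions, not the paper's encoding.

```python
import random


def fitness(bits):
    # Toy objective (one-max): stands in for a network's score on a task.
    return sum(bits)


def learn(bits, steps=5):
    # "Learning" as greedy local search: try flipping the first few bits,
    # keeping any flip that improves fitness. The genotype is not modified.
    best = bits[:]
    for i in range(min(steps, len(bits))):
        trial = best[:]
        trial[i] ^= 1
        if fitness(trial) > fitness(best):
            best = trial
    return best


def evolve(mode="baldwin", pop_size=20, length=16, gens=30, seed=0):
    """mode: 'none' (no learning), 'baldwin', or 'lamarckian'."""
    rng = random.Random(seed)
    pop = [[rng.randint(0, 1) for _ in range(length)] for _ in range(pop_size)]
    for _ in range(gens):
        scored = []
        for ind in pop:
            learned = learn(ind) if mode in ("baldwin", "lamarckian") else ind
            # Baldwin: selection sees the learned phenotype's fitness,
            # but the unmodified genotype is what reproduces.
            # Lamarckian: the learned bits replace the genotype itself.
            genome = learned if mode == "lamarckian" else ind
            scored.append((fitness(learned), genome))
        scored.sort(key=lambda t: t[0], reverse=True)
        parents = [g for _, g in scored[: pop_size // 2]]
        # One-point crossover plus occasional point mutation.
        pop = []
        while len(pop) < pop_size:
            a, b = rng.sample(parents, 2)
            cut = rng.randrange(1, length)
            child = a[:cut] + b[cut:]
            if rng.random() < 0.1:
                child[rng.randrange(length)] ^= 1
            pop.append(child)
    return max(fitness(ind) for ind in pop)
```

The Baldwin effect appears because individuals whose genotypes sit near good solutions score well after learning, so selection favors them even though their genomes are never directly edited; over generations the population drifts toward genotypes that need less learning. This is only a schematic of that dynamic, not the cellular-development encoding studied in the paper.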