Generating large-scale neural networks through discovering geometric regularities

  • Authors:
  • Jason Gauci; Kenneth Stanley

  • Affiliations:
  • University of Central Florida, Orlando, FL; University of Central Florida, Orlando, FL

  • Venue:
  • Proceedings of the 9th Annual Conference on Genetic and Evolutionary Computation (GECCO 2007)
  • Year:
  • 2007

Abstract

Connectivity patterns in biological brains exhibit many repeating motifs. This repetition mirrors inherent geometric regularities in the physical world. For example, stimuli that excite adjacent locations on the retina map to neurons that are similarly adjacent in the visual cortex. In this way, neural connectivity can exploit geometric locality in the outside world by employing local connections in the brain. If such regularities could be discovered by methods that evolve artificial neural networks (ANNs), they could be similarly exploited to solve problems that would otherwise require optimizing too many dimensions. This paper introduces such a method, called Hypercube-based NeuroEvolution of Augmenting Topologies (HyperNEAT), which evolves a novel generative encoding called connective Compositional Pattern Producing Networks (connective CPPNs) to discover geometric regularities in the task domain. Connective CPPNs encode connectivity patterns as concepts that are independent of the number of inputs or outputs, allowing functional large-scale neural networks to be evolved. The approach is tested on a simple visual task for which it effectively discovers the correct underlying regularity, allowing the solution both to generalize and to scale without loss of function to an ANN of over eight million connections.
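
To make the encoding concrete, the sketch below (not the authors' implementation) illustrates the general idea of querying a connective CPPN over the coordinates of node pairs on a substrate to obtain a connectivity pattern. The toy_cppn function and the pruning threshold are invented for illustration; in HyperNEAT the CPPN itself is evolved with NEAT, and querying the same CPPN at a higher substrate resolution is what allows the resulting network to scale.

import math
from itertools import product

def toy_cppn(x1, y1, x2, y2):
    # Hypothetical stand-in for an evolved CPPN: maps the coordinates of a
    # source node (x1, y1) and a target node (x2, y2) to a connection weight.
    # This toy function happens to favor geometrically local connections.
    dist = math.hypot(x2 - x1, y2 - y1)
    return math.exp(-dist * dist) * math.cos(x1 * y2)

def build_substrate_weights(resolution, threshold=0.2):
    # Query the CPPN once per ordered node pair on a resolution x resolution
    # grid of coordinates normalized to [-1, 1]. Connections whose weight
    # magnitude falls below the threshold are left unexpressed.
    coords = [(2 * x / (resolution - 1) - 1, 2 * y / (resolution - 1) - 1)
              for x, y in product(range(resolution), repeat=2)]
    weights = {}
    for (x1, y1), (x2, y2) in product(coords, repeat=2):
        w = toy_cppn(x1, y1, x2, y2)
        if abs(w) > threshold:
            weights[((x1, y1), (x2, y2))] = w
    return weights

if __name__ == "__main__":
    # The same CPPN yields a consistent connectivity pattern at any resolution,
    # which is the sense in which the encoding is independent of network size.
    for res in (5, 11):
        print(res, "x", res, "substrate:",
              len(build_substrate_weights(res)), "connections")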