The Hypercube-based NeuroEvolution of Augmenting Topologies (HyperNEAT) approach demonstrated that the pattern of weights across the connectivity of an artificial neural network (ANN) can be generated as a function of its geometry, thereby allowing large ANNs to be evolved for high-dimensional problems. Yet it left to the user the question of where hidden nodes should be placed in a geometry that is potentially infinitely dense. To relieve the user from this decision, this paper introduces an extension called evolvable-substrate HyperNEAT (ES-HyperNEAT) that determines the placement and density of the hidden nodes based on a quadtree-like decomposition of the hypercube of weights and a novel insight about the relationship between connectivity and node placement. The idea is that the representation in HyperNEAT that encodes the pattern of connectivity across the ANN contains implicit information on where the nodes should be placed and can therefore be exploited to avoid the need to evolve explicit placement. In this paper, as a proof of concept, ES-HyperNEAT discovers working placements of hidden nodes for a simple navigation domain on its own, thereby eliminating the need to configure the HyperNEAT substrate by hand and suggesting the potential power of the new approach.
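The quadtree-like decomposition described above can be sketched in a few lines: a region of the substrate is sampled, and regions whose weight pattern varies are subdivided further, so hidden nodes become dense exactly where the connectivity pattern carries information. The sketch below is illustrative only; `sample_weight` is a hypothetical stand-in for querying the evolved CPPN, and the threshold and depth parameters are assumptions, not values from the paper.

```python
import statistics

def sample_weight(x, y):
    # Hypothetical stand-in for querying the evolved CPPN at (x, y);
    # ES-HyperNEAT actually samples the hypercube of connection weights.
    return x * y

def quadtree_points(cx, cy, size, depth=0, max_depth=5, var_threshold=0.01):
    """Recursively subdivide a square region of the substrate.

    Regions whose sampled weights show high variance are split further,
    so candidate hidden nodes cluster where the connectivity pattern has
    more structure. Returns the centers of the resulting leaf squares.
    """
    half = size / 2.0
    # Sample the four quadrant centers of this region.
    offsets = [(-half / 2, -half / 2), (half / 2, -half / 2),
               (-half / 2, half / 2), (half / 2, half / 2)]
    samples = [sample_weight(cx + dx, cy + dy) for dx, dy in offsets]
    if depth >= max_depth or statistics.pvariance(samples) < var_threshold:
        return [(cx, cy)]  # uniform region: one node position suffices
    points = []
    for dx, dy in offsets:
        points.extend(quadtree_points(cx + dx, cy + dy, half,
                                      depth + 1, max_depth, var_threshold))
    return points

# Decompose a substrate spanning [-1, 1] x [-1, 1].
nodes = quadtree_points(0.0, 0.0, 2.0)
```

Under this toy weight function the decomposition refines near regions where `x * y` changes sign, illustrating how implicit information in the weight pattern can drive node placement without any hand-configured substrate.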