The recently introduced evolvable-substrate HyperNEAT algorithm (ES-HyperNEAT) demonstrated that the placement and density of hidden nodes in an artificial neural network can be determined from implicit information in an infinite-resolution pattern of weights, thereby avoiding the need to evolve explicit placement. However, ES-HyperNEAT is computationally expensive because it must search the entire hypercube, and it was shown only to match the performance of the original HyperNEAT on a simple benchmark problem. Iterated ES-HyperNEAT, introduced in this paper, reduces computational cost by focusing the search on a sequence of two-dimensional cross-sections of the hypercube, thereby making it possible to search the hypercube at a finer resolution. A series of experiments and an analysis of the evolved networks show for the first time that iterated ES-HyperNEAT not only matches but outperforms the original HyperNEAT in more complex domains, because ES-HyperNEAT can evolve networks with limited connectivity, elaborate on existing network structure, and compensate for the movement of information within the hypercube.
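To make the core idea concrete, the following is a minimal, hypothetical sketch (not the authors' implementation, which uses a quadtree-based division of the substrate) of how hidden-node positions might be read off one two-dimensional cross-section of the hypercube: a stand-in CPPN-like function `cppn_weight` supplies the weight pattern, and target positions are kept only where the local variance of the pattern is high, i.e. where the pattern carries information. The function, the source position, the resolution, and the variance threshold are all illustrative assumptions.

```python
import math

# Hypothetical stand-in for an evolved CPPN: maps a point (x1, y1, x2, y2)
# in the 4D hypercube to a connection weight between substrate positions
# (x1, y1) and (x2, y2).
def cppn_weight(x1, y1, x2, y2):
    return math.sin(3.0 * x1) * math.cos(3.0 * y2) + 0.3 * x2 * y1

def candidate_hidden_nodes(source, resolution=16, variance_threshold=0.005):
    """Sample one 2D cross-section of the hypercube (the outgoing weights
    of `source`) and keep target positions whose local weight variance
    exceeds a threshold (illustrative variance heuristic)."""
    sx, sy = source
    step = 2.0 / resolution
    chosen = []
    for i in range(resolution):
        for j in range(resolution):
            x = -1.0 + (i + 0.5) * step
            y = -1.0 + (j + 0.5) * step
            # Local variance over the cell centre and four half-step
            # neighbours; a flat region of the pattern yields ~0 variance.
            samples = [cppn_weight(sx, sy, x + dx, y + dy)
                       for dx, dy in ((0.0, 0.0),
                                      (step / 2, 0.0), (-step / 2, 0.0),
                                      (0.0, step / 2), (0.0, -step / 2))]
            mean = sum(samples) / len(samples)
            var = sum((s - mean) ** 2 for s in samples) / len(samples)
            if var > variance_threshold:
                chosen.append((x, y))
    return chosen

# Hidden-node positions implied by the pattern for one source neuron.
nodes = candidate_hidden_nodes(source=(0.5, 0.0))
print(len(nodes))
```

Raising the resolution refines where nodes are placed without changing the genome, which is the sense in which placement is "implicit" in the weight pattern rather than evolved explicitly; iterating the same cross-section scan from newly discovered nodes would then grow the network outward layer by layer.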