The global dynamics of automata networks (such as neural networks) are a function of their topology and the choice of automata used. Evolutionary methods can be applied to the optimisation of these parameters, but their computational cost is prohibitive unless they operate on a compact representation. Graph grammars provide such a representation by allowing network regularities to be efficiently captured and reused. We present a system for encoding and evolving automata networks as collective hypergraph grammars, and demonstrate its efficacy on the classical problems of symbolic regression and the design of neural network architectures.
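To illustrate the core idea of a compact grammatical encoding, here is a minimal sketch, not the paper's actual system: each production rewrites a nonterminal edge into a small subgraph, so a short grammar can unfold into a larger network and repeated motifs are reused. All names (`expand`, the placeholder node indices) are illustrative assumptions.

```python
# Minimal sketch of graph-grammar network growth (illustrative, assumed API).
# Edges are (label, u, v) tuples; a rule maps a nonterminal label to a body
# whose node indices 0 and 1 stand for the rewritten edge's endpoints and
# 2 stands for a freshly created node.

def expand(edges, rules, max_steps=10):
    """Repeatedly rewrite nonterminal edges using the grammar rules."""
    fresh = max(n for e in edges for n in e[1:]) + 1  # next unused node id
    for _ in range(max_steps):
        nonterminals = [e for e in edges if e[0] in rules]
        if not nonterminals:
            break  # only terminal edges remain
        label, u, v = nonterminals[0]
        edges.remove((label, u, v))
        w, fresh = fresh, fresh + 1  # one new interior node per rewrite
        node = {0: u, 1: v, 2: w}
        for (lbl, a, b) in rules[label]:
            edges.append((lbl, node[a], node[b]))
    return edges

# Grammar: nonterminal 'S' rewrites to two terminal edges via a hidden node.
rules = {'S': [('t', 0, 2), ('t', 2, 1)]}
network = expand([('S', 10, 11)], rules)
print(network)  # → [('t', 10, 12), ('t', 12, 11)]
```

In an evolutionary setting along the lines the abstract describes, the genome would be the rule set itself rather than the expanded network, which is what keeps the representation compact enough for search.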