Evolutionary computation: toward a new philosophy of machine intelligence
An Introduction to Genetic Algorithms
Journal of Global Optimization
SEPA: structure evolution and parameter adaptation in feed-forward neural networks
GECCO'03 Proceedings of the 2003 International Conference on Genetic and Evolutionary Computation: Part II
No free lunch theorems for optimization
IEEE Transactions on Evolutionary Computation
Mutation-based genetic neural network
IEEE Transactions on Neural Networks
The typical automatic way to search for an optimal neural network is to combine structure evolution by evolutionary computation with weight adaptation by backpropagation. In this model, because structure and weight optimization are carried out by two different algorithms, each with its own search space, every change in network topology during structure evolution forces the entire set of weights to be relearned by backpropagation. To avoid this inefficiency, we propose that the evolution of network structure and weights be purely stochastic and tightly integrated, so that good weights and structures are propagated from generation to generation rather than relearned. Because this model does not depend on gradient information, the entire process allows more flexibility in the implementation of its evolution and in the formulation of its fitness function. This study demonstrates how invasive connectionist evolution can easily be implemented using particle swarm optimization (PSO), evolutionary programming (EP), and differential evolution (DE), with good performance on cancer and glass classification tasks.
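The gradient-free search the abstract describes can be sketched with one of the named algorithms, differential evolution: network weights are treated as a flat real vector and improved purely by mutation, crossover, and greedy selection on the fitness value, with no backpropagation. The network size, DE parameters (`F`, `CR`, population size), and the XOR toy task below are illustrative assumptions, not the paper's actual setup:

```python
import numpy as np

rng = np.random.default_rng(0)

def forward(w, X, n_in, n_hid):
    """Tiny one-hidden-layer network whose weights arrive as a flat vector."""
    k = n_in * n_hid
    W1 = w[:k].reshape(n_in, n_hid)
    b1 = w[k:k + n_hid]
    W2 = w[k + n_hid:k + 2 * n_hid]
    b2 = w[-1]
    h = np.tanh(X @ W1 + b1)
    return 1.0 / (1.0 + np.exp(-(h @ W2 + b2)))   # sigmoid output

def fitness(w, X, y, n_in, n_hid):
    # Only the scalar loss is used -- no gradient information anywhere.
    return np.mean((forward(w, X, n_in, n_hid) - y) ** 2)

def de_train(X, y, n_in, n_hid=4, pop_size=30, gens=200, F=0.5, CR=0.9):
    dim = n_in * n_hid + n_hid + n_hid + 1        # total weight count
    pop = rng.normal(0.0, 1.0, (pop_size, dim))
    fit = np.array([fitness(w, X, y, n_in, n_hid) for w in pop])
    for _ in range(gens):
        for i in range(pop_size):
            others = [j for j in range(pop_size) if j != i]
            a, b, c = pop[rng.choice(others, 3, replace=False)]
            mutant = a + F * (b - c)              # DE/rand/1 mutation
            cross = rng.random(dim) < CR
            cross[rng.integers(dim)] = True       # guarantee one gene crosses over
            trial = np.where(cross, mutant, pop[i])
            f_trial = fitness(trial, X, y, n_in, n_hid)
            if f_trial < fit[i]:                  # greedy selection: good weights survive
                pop[i], fit[i] = trial, f_trial
    best = np.argmin(fit)
    return pop[best], fit[best]

# XOR as a stand-in for a small classification task
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([0, 1, 1, 0], dtype=float)
best_w, best_loss = de_train(X, y, n_in=2)
```

Because selection copies a surviving weight vector unchanged into the next generation, good weights are propagated rather than relearned, which is the property the abstract emphasizes; extending the encoding with structural genes (e.g. connection on/off masks) would evolve topology in the same stochastic loop.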