Advanced Algorithms and Operators
Evolutionary neural networks have recently become an active topic in the neural network community because of their flexibility and strong performance. However, they suffer from premature convergence caused by the genetic drift of evolutionary algorithms: the genetic diversity of the population decreases quickly, and the search loses its exploration capability. Inspired by diversity in nature, a number of speciation algorithms have been proposed to maintain diverse solutions in the population. One difficulty with this approach is the lack of guidance on which distance measure between neural networks should be used to penalize or discard similar solutions. In this paper, six distance measures (genotypic, phenotypic, and behavioral types) are compared with representative speciation algorithms (fitness sharing and deterministic crowding genetic algorithms) on six UCI benchmark datasets. The results show that the choice of distance measure is important in neural network evolution.
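To illustrate how a distance measure interacts with a speciation algorithm, the following is a minimal sketch of fitness sharing using a genotypic (weight-vector) distance. The function names, the sharing radius `sigma_share`, and the shape of the population are illustrative assumptions, not the paper's implementation; the general scheme is the standard one, where each individual's fitness is divided by its niche count.

```python
import numpy as np

def genotypic_distance(w1, w2):
    # Genotypic measure: Euclidean distance between flattened weight vectors.
    return np.linalg.norm(w1 - w2)

def shared_fitness(fitness, population, sigma_share=1.0, alpha=1.0):
    # Fitness sharing: divide each individual's raw fitness by its niche
    # count, so individuals crowding the same region of the search space
    # are penalized and diversity is preserved.
    shared = np.empty(len(population))
    for i, wi in enumerate(population):
        niche_count = 0.0
        for wj in population:
            d = genotypic_distance(wi, wj)
            if d < sigma_share:
                # Triangular sharing function: contribution falls off
                # linearly (alpha = 1) with distance inside the radius.
                niche_count += 1.0 - (d / sigma_share) ** alpha
        shared[i] = fitness[i] / niche_count
    return shared

# Two identical individuals share a niche; the third is isolated.
pop = [np.array([0.0]), np.array([0.0]), np.array([10.0])]
fit = np.array([2.0, 2.0, 3.0])
print(shared_fitness(fit, pop))  # the duplicates are penalized to 1.0 each
```

Swapping `genotypic_distance` for a phenotypic or behavioral measure (e.g. distance between network outputs on a validation set) changes only the distance function, which is exactly the design axis the paper's comparison varies.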