Parallel Genetic Algorithms Population Genetics and Combinatorial Optimization
Proceedings of the 3rd International Conference on Genetic Algorithms
PEPNet: Parallel Evolutionary Programming for Constructing Artificial Neural Networks
EP '97 Proceedings of the 6th International Conference on Evolutionary Programming VI
A Comparative Study of Five Parallel Genetic Algorithms using the Traveling Salesman Problem
IPPS '98 Proceedings of the 12th International Parallel Processing Symposium
A Study of a Non-Linear Optimization Problem Using a Distributed Genetic Algorithm
ICPP '96 Proceedings of the 1996 International Conference on Parallel Processing - Volume 2
Time-series forecasting using flexible neural tree model
Information Sciences: an International Journal
AIA'06 Proceedings of the 24th IASTED international conference on Artificial intelligence and applications
Soft-Computing: mit Neuronalen Netzen, Fuzzy-Logic und Evolutionären Algorithmen (eXamen.press)
Parameter control in evolutionary algorithms
IEEE Transactions on Evolutionary Computation
A new evolutionary system for evolving artificial neural networks
IEEE Transactions on Neural Networks
In this paper we study how the parallelization of a learning algorithm affects the generalization ability of Evolutionary Artificial Neural Networks (EANNs). The newly proposed evolutionary algorithm (EA), which improves chromosomes according to characteristics of both their genotype and phenotype, was used to evolve ANNs. The EA was parallelized by two schemes: the migration approach, which periodically exchanges the best individuals between all parallel populations, and the recently developed migration-strangers strategy, which extends the search space during evolution by replacing the worst chromosomes in the parallel populations with randomly generated new ones, called strangers. Experiments were conducted on the Mackey-Glass chaotic time series problem to determine the best and average prediction errors on training and testing data for small and large ANNs evolved by both parallel evolutionary algorithms (PEAs). The results showed that the PEAs can produce compact ANNs with high prediction precision and insignificant differences between training and testing errors.
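The strangers-replacement step described in the abstract can be sketched roughly as follows. The chromosome encoding, fitness convention, and all function names are illustrative assumptions, not the authors' actual implementation:

```python
import random

def random_chromosome(length):
    """Generate a random real-coded chromosome (a 'stranger').
    The real-valued encoding and the [-1, 1] range are assumptions."""
    return [random.uniform(-1.0, 1.0) for _ in range(length)]

def replace_with_strangers(population, fitness, n_strangers, length):
    """Replace the n_strangers worst individuals of one parallel
    population with randomly generated strangers, extending the
    search space as in the migration-strangers strategy.

    population -- list of chromosomes (modified in place and returned)
    fitness    -- parallel list of fitness values (higher is better)
    """
    # Indices ordered from worst fitness to best.
    order = sorted(range(len(population)), key=lambda i: fitness[i])
    for i in order[:n_strangers]:
        population[i] = random_chromosome(length)
    return population
```

In a parallel run, each population would apply this step periodically, in addition to (or instead of) exchanging its best individuals with the other populations under the plain migration scheme.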