This paper proposes a new paradigm, referred to as Recurrent Genetic Algorithms (RGA), to sustain Genetic Algorithm (GA) evolvability and improve its ability to find superior solutions. RGA attempts to continually recover the evolvability loss caused by the canonical GA iteration process. It borrows the term Recurrent from the taxonomy of Neural Networks (NN), in which a Recurrent NN (RNN) is a special type of network that uses a feedback loop, usually to account for temporal information embedded in the sequence of data points presented to the network. Unlike in RNNs, the temporal dimension in our algorithm pertains to the sequential nature of the evolution process itself, not to the data sampled from the problem's solution space. Empirical evidence shows that the new algorithm better preserves the population's diversity and yields a higher number of constructive crossovers and mutations. Furthermore, evidence shows that RGA outperforms the standard GA on two NP problems and, when aided by problem-encoding information, on three continuous optimisation problems.
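The abstract does not detail how the recurrent feedback is realised, so the following is only a minimal illustrative sketch, not the authors' published algorithm: a canonical GA loop extended with a hypothetical feedback step that archives genomes from earlier generations and reinjects them when a simple diversity measure collapses. The OneMax objective and all names (`recurrent_ga`, `diversity_floor`, the archive rule) are assumptions made for illustration.

```python
import random

GENOME_LEN = 32
POP_SIZE = 50


def fitness(genome):
    # Toy objective (OneMax): maximise the number of 1-bits.
    return sum(genome)


def diversity(population):
    # Crude diversity proxy: mean per-locus Bernoulli variance.
    total = 0.0
    for i in range(GENOME_LEN):
        p = sum(ind[i] for ind in population) / len(population)
        total += p * (1 - p)
    return total / GENOME_LEN


def crossover(a, b):
    point = random.randint(1, GENOME_LEN - 1)
    return a[:point] + b[point:]


def mutate(genome, rate=1.0 / GENOME_LEN):
    return [1 - g if random.random() < rate else g for g in genome]


def tournament(population, k=3):
    return max(random.sample(population, k), key=fitness)


def recurrent_ga(generations=100, diversity_floor=0.05):
    population = [[random.randint(0, 1) for _ in range(GENOME_LEN)]
                  for _ in range(POP_SIZE)]
    archive = []  # memory of earlier generations (illustrative assumption)
    for _ in range(generations):
        archive.append(random.choice(population))
        # Canonical GA iteration: selection, crossover, mutation.
        population = [
            mutate(crossover(tournament(population), tournament(population)))
            for _ in range(POP_SIZE)
        ]
        # Hypothetical feedback loop over the evolution process itself:
        # when diversity collapses, reinject archived genomes to restore it.
        if diversity(population) < diversity_floor and archive:
            for _ in range(POP_SIZE // 10):
                population[random.randrange(POP_SIZE)] = random.choice(archive)
    return max(population, key=fitness)


if __name__ == "__main__":
    best = recurrent_ga()
    print("best fitness:", fitness(best), "of", GENOME_LEN)
```

The point of the sketch is the placement of the feedback: it operates on the sequence of generations rather than on the data presented to the optimiser, mirroring the abstract's distinction between RGA and the data-level recurrence of an RNN.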