When attempting to evolve models of the real world from information obtained by interacting with it, the same problem always arises: the fitness function of the problem, that is, the real world itself, can only be known in a sampled manner. In this article we study how different parameters and algorithmic strategies affect evolutionary processes that operate with sampled fitness functions. The results presented here correspond to a study of the effect of the size of the Short Term Memory and of the number of generations between updates for a generic genetic algorithm operating within the Multilevel Darwinist Brain. From these results, several critical points can be identified that define how far computation can be simplified when working with sampled fitness functions while preserving the same representational power. We also study different proposals for the construction of Short Term Memories and their replacement strategies, aimed at obtaining the maximum information with the minimum use of resources.
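The setup described above can be illustrated with a minimal sketch: a genetic algorithm whose fitness is evaluated only over the samples currently held in a Short Term Memory, which is refreshed every few generations by a new interaction with the (unknown) real world. All names, parameters, the toy target function, and the FIFO replacement strategy are illustrative assumptions, not the authors' actual implementation.

```python
import random

STM_SIZE = 20        # assumed Short Term Memory size
UPDATE_PERIOD = 5    # assumed generations between STM updates
POP_SIZE = 30
GENERATIONS = 40

def real_world_sample():
    """Stand-in for one interaction with the real world: one sampled
    (input, observed output) point of the unknown target function."""
    x = random.uniform(-1.0, 1.0)
    return (x, 2.0 * x + 0.5)   # hidden target used only for illustration

def model_output(genome, x):
    """Toy model: genome[0] is a slope, genome[1] an intercept."""
    return genome[0] * x + genome[1]

def fitness(genome, stm):
    """Sampled fitness: negated squared error over the STM contents only,
    never over the full (unknowable) real-world fitness function."""
    return -sum((model_output(genome, x) - y) ** 2 for x, y in stm)

def replace_oldest(stm, sample):
    """One possible replacement strategy (FIFO): discard the oldest
    sample when the memory is full."""
    if len(stm) >= STM_SIZE:
        stm.pop(0)
    stm.append(sample)

def evolve():
    random.seed(0)
    stm = [real_world_sample() for _ in range(STM_SIZE)]
    pop = [[random.uniform(-2, 2) for _ in range(2)] for _ in range(POP_SIZE)]
    for gen in range(GENERATIONS):
        # refresh the STM only every UPDATE_PERIOD generations
        if gen % UPDATE_PERIOD == 0:
            replace_oldest(stm, real_world_sample())
        # truncation selection on the sampled fitness
        pop.sort(key=lambda g: fitness(g, stm), reverse=True)
        parents = pop[:POP_SIZE // 2]
        children = []
        for _ in range(POP_SIZE - len(parents)):
            a, b = random.sample(parents, 2)
            children.append([(ai + bi) / 2 + random.gauss(0, 0.1)
                             for ai, bi in zip(a, b)])
        pop = parents + children
    pop.sort(key=lambda g: fitness(g, stm), reverse=True)
    return pop[0], stm

best, stm = evolve()
```

The two parameters studied in the article map directly onto `STM_SIZE` and `UPDATE_PERIOD` here: a larger memory gives a better-sampled fitness at higher storage cost, while a longer update period reduces real-world interactions at the risk of the memory becoming stale.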