Proceedings of the 14th annual conference companion on Genetic and evolutionary computation
When evolutionary algorithms are applied to problems with computationally intensive fitness functions, only a limited budget of evaluations is usually available. For these types of problems, minimizing the number of function evaluations becomes paramount, which can be achieved by using smaller population sizes and limiting the number of generations per run. Unfortunately, this leads to a limited sampling of the problem space, making it less likely that adequate solutions will be found. Evolutionary algorithms (EAs) can be augmented with machine learners (MLs) to explore the problem space more effectively. However, a "well-tuned" evolutionary algorithm strikes a balance between its constituent operators; failure to do so can produce implementations that prematurely converge to inferior solutions or fail to converge at all. One aspect of such "tuning" is the use of a proper selection pressure. Introducing a machine learner into an EA/ML hybrid introduces a new form of "emergent" selection pressure for which practitioners may need to compensate. This research shows that two implementations of EA/ML hybrids that filter out inferior offspring, based on knowledge inferred from better individuals, have different emergent selection pressure characteristics.
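The filtering idea described above can be illustrated with a minimal sketch. This is a hypothetical toy, not either of the paper's actual implementations: it uses a simple nearest-centroid rule as a stand-in for the machine learner, a sphere function as a stand-in for an expensive fitness function, and illustrative parameter choices throughout. Offspring the "learner" predicts to be inferior are discarded before they are ever evaluated, which is the source of the extra, emergent selection pressure the abstract refers to.

```python
import random

random.seed(1)
DIM, POP_SIZE, GENS = 5, 20, 30

def fitness(x):
    # Stand-in for a computationally expensive fitness function (minimize).
    return sum(v * v for v in x)

def mutate(x, sigma=0.3):
    # Gaussian mutation of every component.
    return [v + random.gauss(0, sigma) for v in x]

def centroid(group):
    return [sum(ind[d] for ind in group) / len(group) for d in range(DIM)]

def dist2(a, b):
    return sum((u - v) ** 2 for u, v in zip(a, b))

pop = [[random.uniform(-3, 3) for _ in range(DIM)] for _ in range(POP_SIZE)]
evaluations = 0

for gen in range(GENS):
    ranked = sorted(pop, key=fitness)
    evaluations += POP_SIZE
    good, bad = ranked[:POP_SIZE // 2], ranked[POP_SIZE // 2:]
    # "Train" the stand-in learner: centroids of the better and worse halves.
    c_good, c_bad = centroid(good), centroid(bad)
    offspring = []
    while len(offspring) < POP_SIZE:
        child = mutate(random.choice(good))
        # Filter: keep only offspring the learner predicts to be "good".
        # Rejected children are never evaluated, saving fitness calls but
        # adding an emergent selection pressure beyond parent selection.
        if dist2(child, c_good) <= dist2(child, c_bad):
            offspring.append(child)
    pop = offspring

best = min(fitness(p) for p in pop)
```

Note the design tension the abstract highlights: the filter reduces wasted evaluations, but it also biases the sampled offspring toward the learner's model of "good", compounding whatever selection pressure the EA's own operators already apply.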