Meta-evolutionary algorithms have long been proposed as a way to automatically discover good parameter settings for use in later optimization runs. In this paper we instead ask whether a meta-evolutionary algorithm makes sense as an optimizer in its own right: we are not interested in the resulting parameter settings, but only in the final solution. This use of meta-EAs makes sense in the context of large numbers of parallel runs, particularly in massive distributed scenarios. A primary issue facing meta-EAs is the stochastic nature of the meta-level fitness function. We consider whether this poses a challenge to establishing a gradient in the meta-level search space, and to what degree multiple tests help to smooth the noise. We discuss the nature of the meta-level search space and its impact on local optima, then examine the degree to which exploitation can be applied. We find that meta-EAs perform well as optimizers and, quite surprisingly, that they do best with only a single test. Greater exploitation appears to reduce performance, but only slightly.
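The sketch below illustrates the setup the abstract describes, not the authors' actual implementation: an outer (meta) EA evolves parameter settings for an inner EA, the meta-level fitness is the noisy result of running the inner EA with those settings, and what we keep is the best inner-EA solution found rather than the parameters. The toy sphere objective, the parameter ranges, and the N_TESTS constant (set to 1, reflecting the paper's single-test finding) are illustrative assumptions.

```python
# Minimal meta-EA sketch (assumptions: toy sphere objective, simple truncation
# selection, illustrative parameter ranges). Not the paper's implementation.
import random

DIM = 10       # dimensionality of the toy inner problem
N_TESTS = 1    # inner runs averaged per meta-fitness evaluation

def sphere(x):
    """Toy minimization objective for the inner EA."""
    return sum(v * v for v in x)

def inner_ea(mutation_sigma, pop_size, generations=50):
    """A simple truncation-selection inner EA; returns its best fitness found."""
    pop = [[random.uniform(-5, 5) for _ in range(DIM)] for _ in range(pop_size)]
    best = min(sphere(ind) for ind in pop)
    for _ in range(generations):
        parents = sorted(pop, key=sphere)[: max(1, pop_size // 2)]
        pop = [
            [g + random.gauss(0, mutation_sigma) for g in random.choice(parents)]
            for _ in range(pop_size)
        ]
        best = min(best, min(sphere(ind) for ind in pop))
    return best

def meta_fitness(params):
    """Noisy meta-level fitness: average best result of N_TESTS inner runs."""
    sigma, pop_size = params
    return sum(inner_ea(sigma, pop_size) for _ in range(N_TESTS)) / N_TESTS

def meta_ea(meta_pop_size=10, meta_generations=20):
    """Outer EA over (mutation_sigma, pop_size); returns the best inner result seen."""
    meta_pop = [(random.uniform(0.01, 1.0), random.randint(4, 40))
                for _ in range(meta_pop_size)]
    best_result = float("inf")
    for _ in range(meta_generations):
        evaluated = sorted(((meta_fitness(p), p) for p in meta_pop),
                           key=lambda t: t[0])
        best_result = min(best_result, evaluated[0][0])
        # Truncation selection on the parameter vectors, then mutate them.
        parents = [p for _, p in evaluated[: max(1, meta_pop_size // 2)]]
        meta_pop = [
            (max(0.001, p[0] + random.gauss(0, 0.05)),
             max(2, p[1] + random.randint(-2, 2)))
            for p in (random.choice(parents) for _ in range(meta_pop_size))
        ]
    return best_result

if __name__ == "__main__":
    print("best inner-EA result found:", meta_ea())
```

With N_TESTS = 1 each meta-evaluation is a single, noisy inner run, which is the regime the paper reports as working best; raising N_TESTS trades parallel evaluations for a smoother meta-level fitness signal.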