Separable functions are composed of subfunctions that depend on mutually disjoint sets of bits. These subfunctions can be optimized independently; in black-box optimization, however, this direct approach is infeasible because the composition of subfunctions may be unknown. A common belief is that evolutionary algorithms make progress on all subfunctions in parallel, so that optimizing a separable function takes not much longer than optimizing the hardest subfunction---the subfunctions are optimized "in parallel." We show that this is only partially true, already for the simple (1+1) evolutionary algorithm ((1+1) EA). For separable functions composed of k Boolean functions, the optimization time is indeed the maximum optimization time of these functions times a small (log k) overhead. More generally, for sums of weighted subfunctions that each attain non-negative integer values less than r = o(log^{1/2} n), we obtain an overhead of O(r log n). However, the hoped-for parallel optimization behavior does not always come true. We present a separable function with k ≤ √n subfunctions such that the (1+1) EA is likely to optimize many subfunctions sequentially. The reason is that standard bit mutation leads to interference between the search processes on different subfunctions. Under mild assumptions, we show that such sequential optimization behavior is worst possible.
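To make the setting concrete, here is a minimal sketch (not from the paper) of the (1+1) EA with standard bit mutation, run on a simple separable function: a sum of k subfunctions, each a OneMax instance on its own disjoint block of bits. The function and parameter names are illustrative assumptions.

```python
import random

def one_plus_one_ea(f, n, f_opt, max_iters=100_000, seed=0):
    """(1+1) EA: flip each bit independently with probability 1/n
    (standard bit mutation) and accept the offspring if it is at
    least as good as the parent (plus selection)."""
    rng = random.Random(seed)
    x = [rng.randint(0, 1) for _ in range(n)]
    fx = f(x)
    for t in range(1, max_iters + 1):
        y = [b ^ (rng.random() < 1.0 / n) for b in x]  # mutate each bit w.p. 1/n
        fy = f(y)
        if fy >= fx:
            x, fx = y, fy
        if fx == f_opt:
            return t  # iterations until the optimum was found
    return max_iters

def separable_onemax(x, k):
    """Separable function: sum of k OneMax subfunctions on
    mutually disjoint blocks of len(x) // k bits each."""
    block = len(x) // k
    return sum(sum(x[i * block:(i + 1) * block]) for i in range(k))

n, k = 60, 4  # illustrative sizes; the optimum value is n (all-ones string)
t = one_plus_one_ea(lambda x: separable_onemax(x, k), n, f_opt=n)
```

Because the algorithm is a black box with respect to the block structure, it cannot exploit the decomposition directly; the abstract's question is how much slower this joint search is than optimizing the hardest block alone.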