The Simple Genetic Algorithm: Foundations and Theory
Upper and Lower Bounds for Randomized Search Heuristics in Black-Box Optimization. Theory of Computing Systems.
No free lunch theorems for optimization. IEEE Transactions on Evolutionary Computation.
An information perspective on evolutionary computation. Proceedings of the 9th Annual Conference Companion on Genetic and Evolutionary Computation.
An information perspective on evolutionary computation. Proceedings of the 10th Annual Conference Companion on Genetic and Evolutionary Computation.
Information Theoretic Classification of Problems for Metaheuristics. SEAL '08: Proceedings of the 7th International Conference on Simulated Evolution and Learning.
An information perspective on evolutionary computation. Proceedings of the 11th Annual Conference Companion on Genetic and Evolutionary Computation: Late Breaking Papers.
Entropy profiles of ranked and random populations. Proceedings of the 12th Annual Conference Companion on Genetic and Evolutionary Computation.
In this paper we relate information theory and Kolmogorov complexity (KC) to optimization in the black-box scenario. We define the set of all possible decisions an algorithm might make during a run, associate each function with a probability distribution over this set, and define the function's entropy accordingly. We show that the expected KC of the set (rather than of the function) is a better measure of problem difficulty, and we analyze the effect of the entropy on the expected KC. Finally, we show, in a restricted scenario, that the permutation closure of a single function, the finest level of granularity at which a No Free Lunch theorem can hold [7], can be associated with a particular value of entropy. This implies bounds on the expected performance of any algorithm on members of that closure.
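The two notions the abstract combines can be illustrated concretely. The sketch below (an assumption-laden toy, not the paper's construction) computes the Shannon entropy of a distribution and enumerates the permutation closure of a single function on a small domain, i.e. the set of functions obtained by permuting its values; the entropy of the uniform distribution over that closure is then just the log of its size:

```python
import itertools
import math

def shannon_entropy(probs):
    """Shannon entropy (in bits) of a probability distribution."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

def permutation_closure(f_values):
    """All distinct functions obtained by permuting the value tuple of f
    over a fixed finite domain: the permutation closure of f."""
    return set(itertools.permutations(f_values))

# Toy example: a function on a 4-point domain with values (0, 0, 1, 2).
# Its closure has 4!/2! = 12 distinct members (the two 0s are interchangeable).
closure = permutation_closure((0, 0, 1, 2))

# Uniform distribution over the closure; its entropy is log2(|closure|).
uniform = [1 / len(closure)] * len(closure)
print(len(closure), shannon_entropy(uniform))
```

Under a No Free Lunch theorem, averaged over such a closure no black-box algorithm outperforms any other, which is why the closure (rather than the single function) is the natural unit to which an entropy value is attached.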