Combinatorial optimization: algorithms and complexity
Simulated annealing: theory and applications
A guided tour of Chernoff bounds
Information Processing Letters
An introduction to Kolmogorov complexity and its applications
Modern heuristic techniques for combinatorial problems
Evolutionary computation: toward a new philosophy of machine intelligence
Randomized algorithms
Approximation algorithms for NP-hard problems
A computational view of population genetics
Random Structures & Algorithms
Evolution and Optimum Seeking: The Sixth Generation
Genetic Algorithms in Search, Optimization and Machine Learning
A Probabilistic Algorithm for k-SAT and Constraint Satisfaction Problems
FOCS '99 Proceedings of the 40th Annual Symposium on Foundations of Computer Science
On the futility of blind search: An algorithmic view of “no free lunch”
Evolutionary Computation
No free lunch theorems for optimization
IEEE Transactions on Evolutionary Computation
On classes of functions for which No Free Lunch results hold
Information Processing Letters
Proceedings of the 8th annual conference on Genetic and evolutionary computation
Optimization, block designs and no free lunch theorems
Information Processing Letters
Proceedings of the 9th annual conference on Genetic and evolutionary computation
Proceedings of the 10th annual conference companion on Genetic and evolutionary computation
Focused no free lunch theorems
Proceedings of the 10th annual conference on Genetic and evolutionary computation
Iterative feature construction for improving inductive learning algorithms
Expert Systems with Applications: An International Journal
The impact of parametrization in memetic evolutionary algorithms
Theoretical Computer Science
Proceedings of the 11th Annual Conference Companion on Genetic and Evolutionary Computation Conference: Late Breaking Papers
Quality measures to adapt the participation in MOS
CEC'09 Proceedings of the Eleventh conference on Congress on Evolutionary Computation
On the brittleness of evolutionary algorithms
FOGA'07 Proceedings of the 9th international conference on Foundations of genetic algorithms
No free lunch and free leftovers theorems for multiobjective optimisation problems
EMO'03 Proceedings of the 2nd international conference on Evolutionary multi-criterion optimization
Learning hybridization strategies in evolutionary algorithms
Intelligent Data Analysis
Free lunches on the discrete Lipschitz class
Theoretical Computer Science
Automatic modeling and usage of contextualized human behavior models
ICS'06 Proceedings of the 10th WSEAS international conference on Systems
Analysis of (1+1) evolutionary algorithm and randomized local search with memory
Evolutionary Computation
Proceedings of the 13th annual conference on Genetic and evolutionary computation
Applied Soft Computing
Evolutionary Computation
Adaptive Memetic Differential Evolution with Global and Local neighborhood-based mutation operators
Information Sciences: an International Journal
Runtime analysis of the (1+1) EA on computing unique input output sequences
Information Sciences: an International Journal
The No Free Lunch (NFL) theorem of Wolpert and Macready (IEEE Trans. Evol. Comput. 1(1) (1997) 67) has led to controversial discussions about the usefulness of randomized search heuristics, in particular evolutionary algorithms. Here a short and simple proof of the NFL theorem is given to show its elementary character; moreover, the proof method leads to a generalization of the NFL theorem. Afterwards, realistic complexity-theoretic scenarios for black-box optimization are presented, and it is argued why NFL theorems cannot hold in such situations. However, an Almost No Free Lunch (ANFL) theorem shows that for each function that a search heuristic optimizes efficiently, many related functions can be constructed on which the same heuristic performs badly. As a consequence, search heuristics embody some idea of where to look for good points and can succeed only on functions that "give the right hints". The consequences of these theoretical considerations for some well-known classes of functions are discussed.
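The core NFL claim can be checked exhaustively on a toy instance: averaged over all objective functions on a finite domain, every non-revisiting deterministic search strategy performs identically. A minimal sketch (the search space of 3 points, the codomain {0,1}, and the two visiting orders are illustrative choices, not from the paper):

```python
from itertools import product

# All 2^3 objective functions f: {0,1,2} -> {0,1}, encoded as value tuples.
FUNCS = list(product([0, 1], repeat=3))

def evals_to_optimum(order, f):
    """Number of evaluations a non-revisiting search that queries the
    points in the given fixed order needs to first hit a global maximum
    of f."""
    best = max(f)
    for t, x in enumerate(order, start=1):
        if f[x] == best:
            return t
    return len(order)

alg_a = [0, 1, 2]  # left-to-right enumeration
alg_b = [2, 0, 1]  # a different fixed visiting order

avg_a = sum(evals_to_optimum(alg_a, f) for f in FUNCS) / len(FUNCS)
avg_b = sum(evals_to_optimum(alg_b, f) for f in FUNCS) / len(FUNCS)

print(avg_a, avg_b)  # identical averages, as the NFL theorem predicts
```

Averaged over all eight functions, both orders need the same expected number of evaluations; no strategy can do better on some functions without doing correspondingly worse on others.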