Dynamic Representations and Escaping Local Optima: Improving Genetic Algorithms and Local Search
Proceedings of the Seventeenth National Conference on Artificial Intelligence and Twelfth Conference on Innovative Applications of Artificial Intelligence
Permutations can represent search problems when all points in the search space have unique evaluations. Given a particular set of N evaluations, there are N! search algorithms and N! possible functions, and a general No Free Lunch result holds over this finite set of N! functions. Furthermore, it is proven that the average description length over the set of N! functions must be O(N lg N). Thus, if the size of the search space is exponentially large with respect to the parameter set that specifies a point in the search space, then the description length of the set of N! functions must also be exponential on average. Summary statistics are identical for all instances of the set of N! functions, including the mean, variance, skewness, and other r-th moment statistics. These summary statistics can be used to show that every set of N! functions must obey an identical set of constraints over its Walsh coefficients; this also imposes mild constraints on schema information for the set of N! functions. When N = 2^L, subsets of the N! functions are related via Gray codes, which partition the N! functions into equivalence classes of size 2^L.
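The invariance of summary statistics across the N! functions can be checked directly for a small search space. The sketch below uses a hypothetical set of N = 4 unique evaluations (the specific values are illustrative, not from the paper): every permutation of the evaluations defines a distinct function over the search space, yet all 24 of them share the same mean and variance.

```python
from itertools import permutations
from statistics import mean, pvariance

# Hypothetical search space of N = 4 points with unique evaluations.
values = (3, 7, 1, 9)

# Each of the N! = 24 permutations of these evaluations defines a
# distinct function over the search space. Collect the (mean, variance)
# pair of every such function into a set.
stats = {(mean(p), pvariance(p)) for p in permutations(values)}

# The set collapses to a single pair: mean and variance (and every other
# r-th moment) depend only on the multiset of evaluations, not on which
# point of the search space receives which evaluation.
print(stats)            # {(5.0, 10.0)}
print(len(stats) == 1)  # True
```

The same collapse occurs for skewness and higher moments, since each is a symmetric function of the evaluation values; this is the sense in which all N! functions are statistically indistinguishable in summary form.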