The choice of genetic representation crucially determines the capability of evolutionary processes to find complex solutions in which many variables interact. The question is how good genetic representations can be found and how they can be adapted online to account for what can be learned about the structure of the problem from previous samples. We address these questions in a scenario that we term indirect Estimation-of-Distribution: we consider a decorrelated search distribution (mutational variability) on a variable-length genotype space. A one-to-one encoding onto the phenotype space then needs to induce an adapted phenotypic variability that incorporates the dependencies between phenotypic variables that have previously been observed to be successful. Formalizing this in the framework of Estimation-of-Distribution Algorithms, an adapted phenotypic variability can be characterized as one minimizing the Kullback-Leibler divergence to a population of previously selected individuals (parents). Our core result is a relation between the Kullback-Leibler divergence and the description length of the encoding in this scenario, stating that compact codes provide a way to minimize this divergence. A proposed class of Compression Evolutionary Algorithms and preliminary experiments with an L-system compression scheme illustrate the approach. We also discuss the implications for the self-adaptive evolution of genetic representations on the basis of neutrality (σ-evolution) towards compact codes.
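To make the Estimation-of-Distribution objective concrete, the following sketch (hypothetical code, not the authors' implementation) fits a decorrelated, fully factorized model q(x) = ∏ᵢ qᵢ(xᵢ) to a population of selected parents and measures the Kullback-Leibler divergence from the empirical parent distribution to that model. For a product model, the per-bit parent frequencies are exactly the maximum-likelihood parameters, i.e. the minimizers of this divergence; any residual divergence reflects dependencies between variables that the decorrelated model cannot capture — the quantity that a good (compact) encoding is supposed to absorb.

```python
import math
from collections import Counter

def fit_factorized(parents):
    """Per-bit frequencies of the parent population.

    For a factorized model q(x) = prod_i q_i(x_i), these frequencies are the
    parameters minimizing KL(p_parents || q)."""
    n = len(parents[0])
    return [sum(p[i] for p in parents) / len(parents) for i in range(n)]

def kl_to_parents(parents, marginals):
    """KL divergence D(p_hat || q) between the empirical distribution of the
    parents and the factorized model, evaluated on the parent sample.

    Every parent configuration has nonzero q-probability because the
    marginals were fitted from the same sample, so the log is well defined."""
    counts = Counter(tuple(p) for p in parents)
    total = len(parents)
    kl = 0.0
    for x, c in counts.items():
        p_hat = c / total
        q = 1.0
        for xi, mi in zip(x, marginals):
            q *= mi if xi == 1 else (1.0 - mi)
        kl += p_hat * math.log(p_hat / q)
    return kl

# Perfectly correlated parents: the product model cannot represent the
# dependency, so the divergence stays strictly positive.
correlated = [(1, 1), (0, 0), (1, 1), (0, 0)]
print(kl_to_parents(correlated, fit_factorized(correlated)))  # log 2

# Independent bits: the factorized model is exact and the divergence vanishes.
independent = [(1, 0), (0, 1), (1, 1), (0, 0)]
print(kl_to_parents(independent, fit_factorized(independent)))  # 0.0
```

The contrast between the two populations is the point of the abstract's argument: on the correlated parents, no choice of marginals drives the divergence to zero, so an adapted encoding (e.g. a compressed, L-system-like genotype in which the two bits are generated from one symbol) is required before a decorrelated search distribution can reproduce the selected structure.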