Compact representations as a search strategy: compression EDAs
Theoretical Computer Science - Foundations of genetic algorithms
A powerful approach to search is to learn a distribution over good solutions (in particular, over the dependencies between their variables) and to sample new search points from this distribution. Existing algorithms learn the search distribution directly on the given problem representation. We ask how search distributions can instead be modeled indirectly, through a proper choice of factorial genetic code. For instance, instead of learning a direct probabilistic model of the dependencies between variables (as BOA does), one can learn a genetic representation of solutions on which these dependencies vanish. We consider questions such as: Can every distribution be induced indirectly by a proper factorial representation? How can such representations be constructed from data? Are specific generative representations, such as grammars or L-systems, universal with respect to inducing arbitrary distributions? We adopt latent variable probabilistic models as a framework for addressing these questions and thereby also establish relations to machine learning concepts such as ICA.
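The core idea of the abstract can be illustrated with a deliberately tiny example. The following sketch is not from the paper itself; the distribution, the decoding map, and all names are illustrative assumptions. It shows a "good solution" distribution concentrated on the bit strings 00 and 11: measured directly on the phenotype, the two bits are fully dependent (one bit of mutual information), but under a generative code that maps a single latent bit z to the phenotype (z, z), the same distribution is induced by one independent latent variable, so the dependency vanishes in the latent (genotype) space.

```python
import math
from itertools import product

# Illustrative toy setup (not from the paper): the distribution of
# "good solutions" puts all mass on the strings 00 and 11.
p_direct = {(0, 0): 0.5, (0, 1): 0.0, (1, 0): 0.0, (1, 1): 0.5}

def mutual_information(p):
    """I(X1; X2) in bits for a joint distribution over two binary variables."""
    px = {a: sum(p[(a, b)] for b in (0, 1)) for a in (0, 1)}
    py = {b: sum(p[(a, b)] for a in (0, 1)) for b in (0, 1)}
    mi = 0.0
    for a, b in product((0, 1), repeat=2):
        if p[(a, b)] > 0:
            mi += p[(a, b)] * math.log2(p[(a, b)] / (px[a] * py[b]))
    return mi

# Measured on the direct representation, the variables are dependent:
print(mutual_information(p_direct))  # 1.0 bit of mutual information

# Indirect encoding: one latent bit z ~ Bernoulli(0.5), decoded as (z, z).
decode = lambda z: (z, z)
p_latent = {0: 0.5, 1: 0.5}

# The induced phenotype distribution matches p_direct exactly, yet the
# latent search distribution is (trivially) factorized over z.
induced = {(a, b): 0.0 for a, b in product((0, 1), repeat=2)}
for z, pz in p_latent.items():
    induced[decode(z)] += pz
print(induced == p_direct)  # True
```

In EDA terms, a direct model would have to represent the coupling between the two bits explicitly, whereas under the indirect code a simple univariate (PBIL-style) model over z already captures the same phenotype distribution.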