Genetic Algorithms in Search, Optimization and Machine Learning.
Estimation of Distribution Algorithms: A New Tool for Evolutionary Computation.
Extending Population-Based Incremental Learning to Continuous Search Spaces. In: PPSN V, Proceedings of the 5th International Conference on Parallel Problem Solving from Nature.
Removing the Genetics from the Standard Genetic Algorithm.
Scalable Optimization via Probabilistic Modeling: From Algorithms to Applications (Studies in Computational Intelligence).
Towards billion-bit optimization via a parallel estimation of distribution algorithm. In: Proceedings of the 9th Annual Conference on Genetic and Evolutionary Computation.
Cross entropy and adaptive variance scaling in continuous EDA. In: Proceedings of the 9th Annual Conference on Genetic and Evolutionary Computation.
An overview of evolutionary algorithms for parameter optimization. Evolutionary Computation.
The equation for response to selection and its use for prediction. Evolutionary Computation.
Initial-population bias in the univariate estimation of distribution algorithm. In: Proceedings of the 11th Annual Conference on Genetic and Evolutionary Computation.
Variance scaling for EDAs revisited. In: KI'11, Proceedings of the 34th Annual German Conference on Advances in Artificial Intelligence.
Real-Valued Compact Genetic Algorithms for Embedded Microcontroller Optimization. IEEE Transactions on Evolutionary Computation.
Goldenberry: EDA visual programming in Orange. In: Proceedings of the 15th Annual Conference Companion on Genetic and Evolutionary Computation.
This paper considers large-scale OneMax and RoyalRoad problems with up to 10^7 binary variables within a compact Estimation of Distribution Algorithm (EDA) framework. Building upon the compact Genetic Algorithm (cGA), the continuous-domain Population-Based Incremental Learning algorithm (PBILc) and the arithmetic-coding EDA, we define a novel method that compactly solves regular and noisy versions of these problems with minimal memory requirements, regardless of problem or population size. This feature allows the algorithm to run on a conventional desktop machine. Issues regarding probability-model sampling, arbitrary precision of the arithmetic-coding decompression scheme, incremental fitness-function evaluation, and updating rules for compact learning are presented and discussed.
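To illustrate the compact-GA component the abstract builds upon, the sketch below shows a minimal cGA on OneMax: two candidates are sampled from a probability vector, and the vector is shifted toward the tournament winner by 1/N, so memory stays O(n) no matter how large the virtual population N is. This is a generic textbook-style sketch, not the paper's method; all function names and parameter values here are illustrative.

```python
import random

def onemax(bits):
    """OneMax fitness: the number of ones in the bit string."""
    return sum(bits)

def cga_onemax(n=50, pop_size=100, max_iters=20000, seed=0):
    """Minimal compact GA on OneMax.

    Stores only a length-n probability vector instead of a population,
    so memory is O(n) regardless of the virtual population size.
    """
    rng = random.Random(seed)
    p = [0.5] * n                  # probability model, one entry per bit
    step = 1.0 / pop_size          # update increment (virtual pop. size N)
    for _ in range(max_iters):
        # Sample two candidate solutions from the model.
        a = [1 if rng.random() < pi else 0 for pi in p]
        b = [1 if rng.random() < pi else 0 for pi in p]
        winner, loser = (a, b) if onemax(a) >= onemax(b) else (b, a)
        # Shift the model toward the winner where the two disagree.
        for i in range(n):
            if winner[i] != loser[i]:
                p[i] += step if winner[i] == 1 else -step
                p[i] = min(1.0, max(0.0, p[i]))
        if all(pi in (0.0, 1.0) for pi in p):  # model fully converged
            break
    return [int(round(pi)) for pi in p]
```

The paper's contribution extends this idea with continuous PBILc-style model updates and an arithmetic-coding representation, which is what keeps the memory footprint small even at 10^7 variables.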