A Boltzmann based estimation of distribution algorithm
Information Sciences: an International Journal
This paper extends Boltzmann selection, a theoretically important method in EDAs, from the discrete domain to the continuous one. The difficulty of estimating the exact Boltzmann distribution in a continuous state space is circumvented by adopting the multivariate Gaussian model, which is popular in continuous EDAs, to approximate only the final sampling distribution. Under the minimum Kullback-Leibler divergence principle, both the mean vector and the covariance matrix of the Gaussian model are calibrated to preserve the features of Boltzmann selection that reflect the desired selection pressure. A method is also proposed to adapt the selection pressure by measuring the success of the past evolution process. This work establishes a formal basis for building probabilistic models with adaptive parameters in continuous EDAs. The framework is incorporated into both the continuous UMDA and the EMNA algorithm and tested on several benchmark problems. The experimental results are compared with some existing EDA variants, and the benefits of the proposed approach are discussed.
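The core idea described in the abstract — fitting a Gaussian to the Boltzmann-reweighted population by minimizing Kullback-Leibler divergence — reduces to a weighted maximum-likelihood estimate of the mean vector and covariance matrix. The following is a minimal sketch of that step, not the paper's exact algorithm: the inverse temperature `beta` (the selection-pressure parameter, which the paper adapts online) and the function names are assumptions for illustration.

```python
import numpy as np

def boltzmann_gaussian_fit(samples, fitness, beta):
    """Fit the Gaussian that minimizes KL divergence to the
    Boltzmann-weighted empirical distribution of the samples.

    Sketch only: `beta` (selection pressure) is fixed here,
    whereas the paper adapts it from past evolution success.
    """
    f = np.asarray(fitness, dtype=float)
    # Boltzmann weights exp(beta * f), shifted by max(f) for
    # numerical stability, then normalized to sum to 1.
    w = np.exp(beta * (f - f.max()))
    w /= w.sum()
    # Weighted MLE of the Gaussian parameters.
    mu = w @ samples
    centered = samples - mu
    cov = (centered * w[:, None]).T @ centered
    return mu, cov

def eda_step(samples, fitness, beta, n_new, rng):
    """One generation: fit the model, then sample new candidates."""
    mu, cov = boltzmann_gaussian_fit(samples, fitness, beta)
    return rng.multivariate_normal(mu, cov, size=n_new)
```

Raising `beta` concentrates the weights on the fittest individuals, shifting the fitted mean toward them and shrinking the covariance — the continuous analogue of increasing selection pressure in discrete Boltzmann selection.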