This paper introduces a new estimation of distribution algorithm called the Boltzmann Univariate Marginal Distribution Algorithm (BUMDA). It uses a univariate Gaussian model to approximate the Boltzmann distribution: formulae for computing the mean and variance of the Gaussian model are derived from the analytical minimization of the Kullback-Leibler divergence. The resulting formulae explicitly incorporate information about the fitness landscape into the computation of the Gaussian parameters; in consequence, the Gaussian distribution is better biased toward intensively sampling the most promising regions than one fitted with the maximum likelihood estimator of the selected set. In addition, the BUMDA formulae require only one user parameter. According to the experimental results, BUMDA excels in its niche of application. We provide theoretical, graphical and statistical analyses to contrast the performance of BUMDA with state-of-the-art EDAs.
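The iteration described in the abstract — sample from a univariate Gaussian, select promising points, and refit the mean and variance with fitness-weighted estimators rather than plain maximum likelihood — can be sketched as follows. This is a minimal illustration, not the authors' reference implementation: the shifted-fitness weighting g(x) = f(x) - f_min + 1, the truncation-selection fraction, and the initial variance are assumptions chosen for the sketch.

```python
import math
import random

def bumda_sketch(fitness, dim, pop_size=60, max_gens=200):
    """Sketch of a BUMDA-style univariate Gaussian EDA (maximization).

    Each generation, the Gaussian parameters are refit with weights
    g(x) = f(x) - f_min + 1 over the selected set, where f_min is the
    worst selected fitness. The weights bias the model toward the most
    promising region; the exact weighting and the 1 + sum(g) variance
    denominator here are illustrative assumptions, not the paper's text.
    """
    mu = [0.0] * dim
    var = [10.0] * dim  # wide initial variance over the search space
    for _ in range(max_gens):
        # Sample the population from the current univariate model.
        pop = [[random.gauss(mu[d], math.sqrt(var[d])) for d in range(dim)]
               for _ in range(pop_size)]
        # Truncation selection: keep the better half.
        sel = sorted(pop, key=fitness, reverse=True)[: pop_size // 2]
        f_min = fitness(sel[-1])                  # worst selected fitness
        g = [fitness(x) - f_min + 1.0 for x in sel]
        total = sum(g)
        for d in range(dim):
            # Fitness-weighted mean and variance instead of plain MLE.
            mu[d] = sum(gi * x[d] for gi, x in zip(g, sel)) / total
            var[d] = sum(gi * (x[d] - mu[d]) ** 2
                         for gi, x in zip(g, sel)) / (1.0 + total)
        if max(var) < 1e-12:  # model has collapsed; stop early
            break
    return mu

# Usage: maximize -||x||^2, whose optimum is the origin.
best = bumda_sketch(lambda x: -sum(xi * xi for xi in x), dim=3)
```

Note that the variance denominator 1 + sum(g) (rather than sum(g)) shrinks the variance slightly each generation, which is one way the model concentrates sampling as the search converges; pop_size plays the role of the single user parameter mentioned in the abstract.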