Evolutionary algorithms that use probabilistic graphical models to represent properties of selected solutions are known as Distribution Estimation Algorithms (DEAs). Work on such algorithms has generally focused on the complexity of the models used. Here, the performance of two DEAs is investigated: one takes the problem variables to be independent, while the other uses pairwise conditional probabilities to build a chain in which each variable conditions the next. Three problems are considered, differing in the extent to which they impose a chain-like structure on the variables. The more complex algorithm performs better on a function that exactly matches the structure of its model. On the other problems, however, the selection mechanism proves crucial: some previously reported gains for the more complex algorithm are shown to be unfounded, and, with comparable selection mechanisms, the simpler algorithm gives better results. Some preliminary explanations of the dynamics of the two algorithms are also offered.
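To make the two model classes concrete, the following is a minimal sketch (not the paper's exact algorithms) of a generic DEA loop driven by either a univariate model or a fixed-order chain model, tested on a simple ones-counting function. All function names (`fit_univariate`, `fit_chain`, `eda`, etc.) are placeholders invented here; in particular, the chain model below assumes a fixed variable ordering, whereas a full chain-based DEA would also learn the ordering, and it uses plain truncation selection, whereas the paper's point is precisely that the choice of selection mechanism matters.

```python
import random

def fit_univariate(pop):
    # Univariate model: marginal frequency of a 1 at each position,
    # treating all variables as independent.
    n = len(pop[0])
    return [sum(x[i] for x in pop) / len(pop) for i in range(n)]

def sample_univariate(p):
    return [int(random.random() < pi) for pi in p]

def fit_chain(pop, eps=1e-3):
    # Chain model: P(x_0 = 1) plus P(x_i = 1 | x_{i-1} = b) for b in {0, 1},
    # over a fixed ordering, with a small pseudocount to avoid
    # zero-probability conditionals when a parent value is unseen.
    n, m = len(pop[0]), len(pop)
    p0 = sum(x[0] for x in pop) / m
    cond = []
    for i in range(1, n):
        probs = []
        for b in (0, 1):
            sel = [x for x in pop if x[i - 1] == b]
            ones = sum(x[i] for x in sel)
            probs.append((ones + eps) / (len(sel) + 2 * eps))
        cond.append(probs)
    return p0, cond

def sample_chain(model):
    # Sample x_0 from its marginal, then each x_i conditioned on x_{i-1}.
    p0, cond = model
    x = [int(random.random() < p0)]
    for probs in cond:
        x.append(int(random.random() < probs[x[-1]]))
    return x

def eda(fit, sample, fitness, n=20, pop_size=100, keep=30, gens=60, seed=0):
    # Generic DEA loop: sample a population, truncation-select the best,
    # re-estimate the model from the survivors, and resample.
    random.seed(seed)
    pop = [[random.randint(0, 1) for _ in range(n)] for _ in range(pop_size)]
    for _ in range(gens):
        pop.sort(key=fitness, reverse=True)
        model = fit(pop[:keep])
        pop = [sample(model) for _ in range(pop_size)]
    return max(pop, key=fitness)

onemax = sum  # fitness of a bitstring = number of ones

best_uni = eda(fit_univariate, sample_univariate, onemax)
best_chain = eda(fit_chain, sample_chain, onemax)
print(onemax(best_uni), onemax(best_chain))
```

On a separable function like ones-counting, both models perform similarly; the paper's chain-structured test functions are what separate the two, and there the interaction between model complexity and selection pressure is the central issue.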