Estimation of Distribution Algorithms (EDAs) have been successfully applied to a wide variety of problems. The algorithmic model of an EDA is generic and can be used with virtually any distribution model, from a simple Bernoulli distribution to a sophisticated Bayesian network. The Hidden Markov Model (HMM) is a well-known graphical model for representing populations of variable-length sequences of discrete values. Surprisingly, HMMs have not yet been used as distribution estimators in an EDA, even though they are powerful tools designed specifically for modelling sequences. We therefore propose a new method, called HMM-EDA, that implements this idea. Preliminary comparative results on two classical combinatorial optimization problems show that HMM-EDA is a promising approach for problems with sequential representations.
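To make the generic EDA loop concrete, here is a minimal sketch using the simplest distribution model the abstract mentions, a vector of independent Bernoulli marginals (as in PBIL/UMDA), applied to the standard OneMax benchmark. This is an illustrative assumption, not the paper's method: HMM-EDA would replace the probability vector `p` below with an HMM fitted to the selected sequences. All names and parameter values are hypothetical.

```python
import random

def bernoulli_eda(length=20, pop_size=100, select=30, iters=50, lr=0.3, seed=0):
    """Minimal EDA with independent Bernoulli marginals (PBIL/UMDA-style).

    Maximizes OneMax (the number of ones in a bit string). In HMM-EDA,
    the model `p` would instead be an HMM trained on the selected
    individuals, capturing sequential dependencies between positions.
    """
    rng = random.Random(seed)
    p = [0.5] * length  # one Bernoulli parameter per position
    pop = []
    for _ in range(iters):
        # Sample a population from the current distribution model.
        pop = [[1 if rng.random() < pi else 0 for pi in p]
               for _ in range(pop_size)]
        # Select the fittest individuals (OneMax fitness = sum of bits).
        pop.sort(key=sum, reverse=True)
        elite = pop[:select]
        # Re-estimate the model from the selected individuals.
        for i in range(length):
            freq = sum(ind[i] for ind in elite) / select
            p[i] = (1 - lr) * p[i] + lr * freq
    best = max(pop, key=sum)
    return best, sum(best)
```

The only step an HMM-based variant changes is the re-estimation: instead of updating independent marginals, it would fit the HMM's transition and emission probabilities to the elite sequences and sample new candidates from the trained model.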