Entropy-based efficiency enhancement techniques for evolutionary algorithms
Information Sciences: an International Journal
A customary paradigm for designing a competent optimization algorithm is to combine an effective global searcher with an efficient local searcher. This paper presents and analyzes an entropy-based substructural local search method (eSLS) for the Bayesian Optimization Algorithm (BOA). The local searcher (the mutation operator) explores the substructural neighborhoods defined by the probabilistic model encoded in the Bayesian network. The improvement offered by each local search step can be estimated from the change the mutation causes in the entropy of the population. Experiments show that incorporating eSLS into BOA substantially reduces the number of costly fitness evaluations required until convergence. Moreover, this paper provides original insights into how the randomness of a population can be exploited to enhance the performance of the optimization process.
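The core idea above can be illustrated with a small sketch. This is not the authors' implementation: in a simplified setting, a "substructure" is just a set of variable indices grouped together by the model, and the effect of a candidate mutation is scored by the change it causes in the empirical Shannon entropy of the population over those variables. The function names and the toy bit-string population are hypothetical.

```python
from collections import Counter
from math import log2

def substructure_entropy(population, indices):
    """Shannon entropy (in bits) of the empirical distribution over the
    variables in `indices`, estimated from subsolution frequencies in the
    population.  `indices` stands in for a substructural neighborhood
    identified by the probabilistic model (hypothetical simplification)."""
    n = len(population)
    counts = Counter(tuple(ind[i] for i in indices) for ind in population)
    return -sum((c / n) * log2(c / n) for c in counts.values())

def entropy_change(population, indices, individual_idx, new_pattern):
    """Entropy difference (after - before) if one individual's subsolution
    is replaced by `new_pattern`.  A negative value means the mutation makes
    the population more ordered over this substructure, which the paper's
    approach uses as a cheap proxy for likely improvement."""
    before = substructure_entropy(population, indices)
    mutated = [list(ind) for ind in population]
    for j, i in enumerate(indices):
        mutated[individual_idx][i] = new_pattern[j]
    after = substructure_entropy(mutated, indices)
    return after - before

# Toy population of bit strings; variables 0 and 1 form one substructure.
pop = [[1, 1, 0], [1, 1, 1], [0, 0, 1], [1, 0, 1]]
h = substructure_entropy(pop, (0, 1))           # entropy over patterns (x0, x1)
d = entropy_change(pop, (0, 1), 3, (1, 1))      # pull individual 3 toward the mode
```

Setting individual 3's subsolution to the most frequent pattern `(1, 1)` yields a negative `entropy_change`, mirroring the intuition that an entropy-reducing substructural mutation concentrates the population on promising subsolutions without an extra fitness evaluation.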