Estimation of Distribution Algorithms: A New Tool for Evolutionary Computation
Recently, advances have been made in continuous, normal-distribution-based Estimation-of-Distribution Algorithms (EDAs) by scaling the variance up from the maximum-likelihood estimate. When done properly, such scaling has been shown to prevent premature convergence on slope-like regions of the search space. In this paper we specifically focus on one way of scaling that was previously introduced as Adaptive Variance Scaling (AVS). It was found that when using AVS, the average number of fitness evaluations grows subquadratically with the dimensionality on a wide range of unimodal test problems, competitively with the CMA-ES. Still, room for improvement exists, because the variance does not always have to be scaled. A previously introduced trigger based on correlation, which determines when to apply scaling, was shown to fail on higher-dimensional problems. Here we provide a new solution called the Standard-Deviation Ratio (SDR) trigger, which is integrated with the Iterated Density-Estimation Evolutionary Algorithm (IDEA). Intuitively put, scaling is triggered with SDR only if improvements are found to be far away from the mean. SDR works even in high dimensions as a result of factorizing the decision rule behind the trigger according to the estimated Bayesian factorization. We evaluate SDR-AVS-IDEA on the same set of benchmark problems and compare it with AVS-IDEA and CMA-ES. We find that the addition of SDR gives AVS-IDEA an important extra edge for use in future research and in applications, both in single-objective optimization and in multi-objective and dynamic optimization. In addition, we provide practical rules of thumb for parameter settings for using SDR-AVS-IDEA that result in asymptotic scale-up behavior that is sublinear for the population size (O(l^{0.85})) and subquadratic (O(l^{1.85})) for the number of evaluations.
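The core idea described above can be illustrated with a minimal sketch: scale the variance only when the mean of the improving solutions lies more than a threshold number of standard deviations from the distribution mean, checked per dimension (a stand-in for the factorized decision rule). The function names, the threshold value, and the multiplicative scaling-factor update below are illustrative assumptions, not the paper's exact formulation.

```python
import numpy as np

def sdr_trigger(improvements, mean, std, threshold=1.0):
    """Illustrative Standard-Deviation Ratio (SDR) trigger.

    improvements : (n, l) array of solutions that improved on the best fitness
    mean, std    : (l,) per-dimension mean and standard deviation of the
                   estimated normal distribution
    threshold    : trigger fires if the mean of the improvements lies more
                   than this many standard deviations from `mean` in any
                   dimension (value chosen here for illustration)
    """
    imp_mean = np.mean(improvements, axis=0)      # average improvement per dimension
    ratio = np.abs(imp_mean - mean) / std         # distance in units of std. dev.
    return bool(np.any(ratio > threshold))        # per-dimension check scales to high l

def avs_update(c_avs, triggered, dec=0.9):
    """Illustrative AVS-style multiplier update: enlarge the variance
    multiplier when the trigger fires, otherwise decay it toward 1 or below."""
    return c_avs / dec if triggered else c_avs * dec
```

As a usage example, improvements clustered two standard deviations from the mean fire the trigger, while improvements near the mean do not:

```python
mean, std = np.zeros(2), np.ones(2)
far  = np.array([[2.0, 0.1], [2.5, -0.1]])   # improvements far from the mean
near = np.array([[0.1, 0.0], [0.2, 0.1]])    # improvements close to the mean
sdr_trigger(far, mean, std)    # -> True  (variance scaling applied)
sdr_trigger(near, mean, std)   # -> False (maximum-likelihood variance kept)
```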