It is known that in real-valued Single-Objective (SO) optimization with Gaussian Estimation-of-Distribution Algorithms (EDAs), it is important to take into account how the distribution parameters change between subsequent generations in order to prevent inefficient convergence as a result of overfitting, especially if dependencies are modelled. We illustrate that in Multi-Objective (MO) optimization the risk of overfitting is even larger, and that it is increased further if clustered variation is used, a technique often employed in Multi-Objective EDAs (MOEDAs) in the form of mixture modelling via clustering the selected solutions in objective space. We point out that a technique previously used in EDAs to remove the risk of overfitting in SO optimization, the Anticipated Mean Shift (AMS), can also be used in MO optimization if clusters in subsequent generations are registered, i.e. matched with each other. We propose to compute this registration explicitly. Although computationally more intensive than existing approaches, this increases the effectiveness of AMS. We further propose a new clustering technique to improve mixture modelling in EDAs by 1) allowing clusters to overlap substantially and 2) assigning each cluster the same number of solutions. This allows any existing EDA to be transformed into a mixture-based version straightforwardly. Finally, we point out the benefit of injecting solutions obtained from running equal-capacity SO optimizers in synchronous parallel, and we investigate experimentally, on 9 well-known benchmark problems, the advantages of each of these techniques.
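The core idea behind AMS in a Gaussian EDA can be illustrated with a minimal sketch. The exact shift multiplier and the fraction of shifted samples vary between published variants, so `delta_ams` and `ams_fraction` below are illustrative defaults, not the paper's settings: a portion of the newly sampled solutions is moved further along the direction in which the distribution mean moved between the previous and the current generation.

```python
import numpy as np

def sample_with_ams(mean, prev_mean, cov, n_samples,
                    ams_fraction=0.5, delta_ams=2.0, rng=None):
    """Sample from a Gaussian model and apply the Anticipated Mean Shift (AMS).

    A fraction of the new samples is translated along delta_ams * (mean - prev_mean),
    the anticipated direction of improvement of the distribution mean.
    Parameter values here are illustrative, not taken from the paper.
    """
    rng = np.random.default_rng() if rng is None else rng
    samples = rng.multivariate_normal(mean, cov, size=n_samples)
    shift = delta_ams * (np.asarray(mean) - np.asarray(prev_mean))
    n_shift = int(ams_fraction * n_samples)  # only part of the population is shifted
    samples[:n_shift] += shift
    return samples
```

In a mixture-based MOEDA, this shift is only meaningful per cluster, which is why clusters in subsequent generations must first be registered so that each cluster's mean can be paired with its predecessor.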
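The proposed clustering technique can likewise be sketched in a few lines. The leader-selection heuristic (farthest-point) and the overlap factor used below are assumptions made for illustration; what the sketch does capture are the two properties named in the abstract: every cluster receives the same number of solutions, and clusters may overlap substantially because the cluster sizes sum to more than the population size.

```python
import numpy as np

def equal_size_overlapping_clusters(points, k, overlap=1.5, rng=None):
    """Partition-free clustering sketch: k equal-size, overlapping clusters.

    Leaders are chosen to be far apart (farthest-point heuristic, an assumed
    choice), and each cluster consists of the c nearest points to its leader,
    with c = ceil(overlap * n / k) so that clusters overlap when overlap > 1.
    Returns a list of k index arrays, each of length c.
    """
    rng = np.random.default_rng() if rng is None else rng
    points = np.asarray(points, dtype=float)
    n = len(points)
    c = min(n, int(np.ceil(overlap * n / k)))  # identical size for every cluster
    # Farthest-point leader selection: start random, then repeatedly add the
    # point farthest from all leaders chosen so far.
    leaders = [int(rng.integers(n))]
    dist = np.linalg.norm(points - points[leaders[0]], axis=1)
    for _ in range(k - 1):
        leaders.append(int(np.argmax(dist)))
        dist = np.minimum(dist, np.linalg.norm(points - points[leaders[-1]], axis=1))
    # Each cluster: the c points nearest to its leader (points may be shared).
    return [np.argsort(np.linalg.norm(points - points[l], axis=1))[:c]
            for l in leaders]
```

Because each cluster is just a fixed-size subset of the selected solutions, any existing single-model EDA can estimate its distribution on each subset independently, which is how an EDA is turned into a mixture-based version.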