A non-parametric statistical dominance operator for noisy multiobjective optimization
SEAL'12 Proceedings of the 9th international conference on Simulated Evolution and Learning
Since many real-world optimization problems are noisy, vector optimization algorithms that can cope with noise and uncertainty are required. We propose new, robust selection strategies for evolutionary multiobjective optimization in the presence of noise. We apply new measures of uncertainty for estimating the recently introduced Pareto-dominance for uncertain and noisy environments (PDU). The first measure is the interquartile range of the outcomes of repeated function evaluations. The second is based on axis-aligned bounding boxes around the upper and lower quantiles of the sampled fitness values in objective space. Experiments on real and artificial problems show promising results.
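The two uncertainty measures described in the abstract can be sketched as follows. This is a minimal illustration, not the authors' implementation: it assumes a minimization setting, uses the 25%/75% quantiles for the boxes, and adopts a conservative dominance rule (a solution's worst quantile corner must dominate the other's best quantile corner); the function names and the specific dominance criterion are choices made here for illustration.

```python
import numpy as np

def iqr(samples):
    """First measure: per-objective interquartile range of the
    outcomes of repeated function evaluations (rows = samples)."""
    samples = np.asarray(samples, dtype=float)
    return np.quantile(samples, 0.75, axis=0) - np.quantile(samples, 0.25, axis=0)

def quantile_box(samples, q_low=0.25, q_high=0.75):
    """Second measure: axis-aligned bounding box spanned by the lower
    and upper quantiles of the sampled fitness values in objective space."""
    samples = np.asarray(samples, dtype=float)
    lo = np.quantile(samples, q_low, axis=0)
    hi = np.quantile(samples, q_high, axis=0)
    return lo, hi

def box_dominates(box_a, box_b):
    """Conservative dominance check (minimization): A dominates B only
    if A's upper-quantile corner is no worse than B's lower-quantile
    corner in every objective, and strictly better in at least one."""
    _, hi_a = box_a
    lo_b, _ = box_b
    return bool(np.all(hi_a <= lo_b) and np.any(hi_a < lo_b))
```

With well-separated noisy samples the boxes do not overlap and the dominance relation is decided; overlapping boxes would leave the pair mutually non-dominated, which is the intended robust behavior under noise.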