Numerical continuation methods: an introduction
Automatic differentiation of algorithms: from simulation to optimization
Stochastic method for the solution of unconstrained vector optimization problems
Journal of Optimization Theory and Applications
Exploiting gradient information in numerical multi-objective evolutionary optimization
GECCO '05 Proceedings of the 7th Annual Conference on Genetic and Evolutionary Computation
Combining gradient techniques for numerical multi-objective evolutionary optimization
Proceedings of the 8th Annual Conference on Genetic and Evolutionary Computation
Local search for multiobjective function optimization: Pareto descent method
Proceedings of the 8th Annual Conference on Genetic and Evolutionary Computation
Gradient Based Stochastic Mutation Operators in Evolutionary Multi-objective Optimization
ICANNGA '07 Proceedings of the 8th International Conference on Adaptive and Natural Computing Algorithms, Part I
Evolutionary continuation methods for optimization problems
Proceedings of the 11th Annual Conference on Genetic and Evolutionary Computation
Using gradient-based information to deal with scalability in multi-objective evolutionary algorithms
CEC'09 Proceedings of the Eleventh Congress on Evolutionary Computation
Computing gap-free Pareto front approximations with stochastic search algorithms
Evolutionary Computation
On gradient based local search methods in unconstrained evolutionary multi-objective optimization
EMO'07 Proceedings of the 4th International Conference on Evolutionary Multi-Criterion Optimization
HM'07 Proceedings of the 4th International Conference on Hybrid Metaheuristics
HCS: a new local search strategy for memetic multiobjective evolutionary algorithms
IEEE Transactions on Evolutionary Computation
A fast and elitist multiobjective genetic algorithm: NSGA-II
IEEE Transactions on Evolutionary Computation
The goal of this research is to study the incorporation of gradient-based information into the design of Multi-objective Evolutionary Algorithms (MOEAs). We analyze the benefits and challenges of using these well-developed mathematical programming techniques to obtain hybrid MOEAs. Since we expect the new hybrid algorithms to search more effectively and efficiently than currently available MOEAs, a deeper study of the balance between the computational cost and the benefits of this coupling is highly necessary.
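To illustrate the general idea of such a coupling (this is a minimal sketch, not the authors' algorithm): a hybrid MOEA can interleave evolutionary variation with a gradient-based local refinement step, for instance descending a weighted-sum scalarization of the objectives. The bi-objective problem, step size, and weights below are hypothetical, chosen only so the example is self-contained.

```python
# Illustrative gradient-based local search for a hybrid MOEA (sketch).
# Hypothetical bi-objective problem: f1(x) = x^2, f2(x) = (x - 2)^2,
# whose Pareto set is the interval [0, 2].

def grad_f1(x):
    return 2.0 * x          # derivative of f1(x) = x^2

def grad_f2(x):
    return 2.0 * (x - 2.0)  # derivative of f2(x) = (x - 2)^2

def local_descent(x, w, step=0.1, iters=50):
    """Gradient descent on the scalarization w*f1 + (1-w)*f2."""
    for _ in range(iters):
        g = w * grad_f1(x) + (1.0 - w) * grad_f2(x)
        x -= step * g
    return x

# Refining each individual with a different weight drives the population
# toward different points of the Pareto set, preserving spread.
population = [5.0, -3.0, 1.0]
weights = [0.9, 0.5, 0.1]
refined = [local_descent(x, w) for x, w in zip(population, weights)]
```

The design question the abstract raises is visible even here: each refinement costs extra gradient evaluations per individual, so the gain in convergence must be weighed against that added computational cost.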