Efficient Natural Evolution Strategies (eNES) is a novel alternative to conventional evolutionary algorithms, using the natural gradient to adapt the mutation distribution. Unlike previous methods based on natural gradients, eNES uses a fast algorithm to calculate the inverse of the exact Fisher information matrix, thus increasing both robustness and performance of its evolution gradient estimation, even in higher dimensions. Additional novel aspects of eNES include optimal fitness baselines and importance mixing (a procedure for updating the population with very few fitness evaluations). The algorithm yields competitive results on both unimodal and multimodal benchmarks.
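The core idea above can be sketched in code. The snippet below is a minimal, simplified natural evolution strategy with a diagonal (separable) Gaussian search distribution, for which the natural-gradient update has a closed form; it is an illustration of the general NES principle, not the paper's exact-Fisher eNES algorithm, and omits optimal fitness baselines and importance mixing. The function names, learning rates, and the rank-based utility weights are common NES heuristics, assumed here for the sketch.

```python
import numpy as np

def snes_minimize(f, x0, sigma0=1.0, pop=20, iters=300, seed=0):
    """Minimal separable-NES sketch (illustrative; not the exact-FIM eNES).

    Samples a population from a diagonal Gaussian, ranks candidates by
    fitness, and takes natural-gradient steps on the mean and log-stddev.
    """
    rng = np.random.default_rng(seed)
    d = len(x0)
    mu = np.asarray(x0, dtype=float)
    sigma = np.full(d, float(sigma0))
    eta_mu = 1.0
    eta_sigma = (3 + np.log(d)) / (5 * np.sqrt(d))  # standard NES heuristic
    # Rank-based fitness shaping (utilities); summing to zero acts as an
    # implicit baseline, a simpler stand-in for the paper's optimal baselines.
    u = np.maximum(0.0, np.log(pop / 2 + 1) - np.log(np.arange(1, pop + 1)))
    u = u / u.sum() - 1.0 / pop
    for _ in range(iters):
        s = rng.standard_normal((pop, d))         # standard-normal samples
        z = mu + sigma * s                        # candidate solutions
        order = np.argsort([f(zi) for zi in z])   # best (lowest f) first
        s = s[order]
        # Natural-gradient updates for a diagonal Gaussian:
        mu += eta_mu * sigma * (u @ s)
        sigma *= np.exp(0.5 * eta_sigma * (u @ (s**2 - 1)))
    return mu

# Usage: minimize the sphere function, a standard unimodal benchmark.
sphere = lambda x: float(np.dot(x, x))
x_best = snes_minimize(sphere, x0=[3.0, -2.0, 1.0])
```

On the sphere function the mean converges toward the origin while the per-dimension step sizes shrink; the full eNES additionally maintains correlations via the exact Fisher information matrix and reuses samples through importance mixing to cut fitness evaluations.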