Numerical Recipes in Pascal: The Art of Scientific Computing
Bayesian Methods for Adaptive Models
Neural Networks for Pattern Recognition
A Taxonomy of Global Optimization Methods Based on Response Surfaces. Journal of Global Optimization
Evolving Neural Networks through Augmenting Topologies. Evolutionary Computation
Efficient Reinforcement Learning through Evolving Neural Network Topologies. GECCO '02: Proceedings of the Genetic and Evolutionary Computation Conference
Evaluation of Gaussian Processes and Other Methods for Non-linear Regression
Information Theory, Inference & Learning Algorithms
Efficient Evolution of Neural Networks through Complexification
Gaussian Processes for Machine Learning (Adaptive Computation and Machine Learning)
A Comparison between Cellular Encoding and Direct Encoding for Genetic Neural Networks. GECCO '96: Proceedings of the 1st Annual Conference on Genetic and Evolutionary Computation
Automatic Gait Optimization with Gaussian Process Regression. IJCAI '07: Proceedings of the 20th International Joint Conference on Artificial Intelligence
Accelerating Evolutionary Algorithms with Gaussian Process Fitness Function Models. IEEE Transactions on Systems, Man, and Cybernetics, Part C: Applications and Reviews
Bayesian Monte Carlo for the Global Optimization of Expensive Functions. Proceedings of the 2010 Conference on ECAI 2010: 19th European Conference on Artificial Intelligence
Variable Risk Control via Stochastic Optimization. International Journal of Robotics Research
The task of finding the optimum of some function f(x) is commonly accomplished by generating and testing sample solutions iteratively, with each new sample x chosen heuristically on the basis of the results obtained so far. We use Gaussian processes to represent predictions and uncertainty about the true function, and describe how to use these predictions to choose each new sample in an optimal way. This approach allowed us to solve a difficult optimization problem (finding weights in a neural network controller that simultaneously balances two vertical poles) using an order of magnitude fewer samples than reported elsewhere.
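The sample-selection loop described in the abstract can be sketched in a few lines: fit a Gaussian process to the samples gathered so far, then evaluate the objective next wherever the GP's expected improvement is highest. The sketch below is illustrative only, not the paper's actual setup: the RBF kernel and its length-scale, the jitter term, the toy 1D objective f, and the grid search over candidates are all assumptions made for the example.

```python
import math

def rbf(a, b, ell=0.3):
    # Squared-exponential kernel; length-scale ell = 0.3 is an assumption.
    return math.exp(-((a - b) ** 2) / (2 * ell ** 2))

def solve(A, b):
    # Naive Gaussian elimination with partial pivoting (fine for small n).
    n = len(A)
    M = [row[:] + [b[i]] for i, row in enumerate(A)]
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(M[r][col]))
        M[col], M[piv] = M[piv], M[col]
        for r in range(col + 1, n):
            f = M[r][col] / M[col][col]
            for c in range(col, n + 1):
                M[r][c] -= f * M[col][c]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (M[r][n] - sum(M[r][c] * x[c] for c in range(r + 1, n))) / M[r][r]
    return x

def gp_posterior(xs, ys, xq, noise=1e-4):
    # GP posterior mean and variance at query point xq; noise jitter
    # keeps the kernel matrix well-conditioned.
    n = len(xs)
    K = [[rbf(xs[i], xs[j]) + (noise if i == j else 0.0)
          for j in range(n)] for i in range(n)]
    alpha = solve(K, ys)
    k = [rbf(x, xq) for x in xs]
    mu = sum(ki * ai for ki, ai in zip(k, alpha))
    v = solve(K, k)
    var = rbf(xq, xq) - sum(ki * vi for ki, vi in zip(k, v))
    return mu, max(var, 1e-12)

def expected_improvement(mu, var, best):
    # EI for maximization: E[max(f - best, 0)] under N(mu, var).
    s = math.sqrt(var)
    z = (mu - best) / s
    Phi = 0.5 * (1 + math.erf(z / math.sqrt(2)))
    phi = math.exp(-z * z / 2) / math.sqrt(2 * math.pi)
    return (mu - best) * Phi + s * phi

def f(x):
    # Toy "expensive" objective standing in for e.g. a controller
    # evaluation; its optimum is at x = 0.3.
    return -(x - 0.3) ** 2

xs = [0.0, 0.5, 1.0]          # initial design
ys = [f(x) for x in xs]
for _ in range(6):
    best = max(ys)
    grid = [i / 100 for i in range(101)]  # candidate points in [0, 1]
    xn = max(grid, key=lambda xq:
             expected_improvement(*gp_posterior(xs, ys, xq), best))
    xs.append(xn)
    ys.append(f(xn))
print(xs[ys.index(max(ys))])  # best sample found, near the optimum
```

Each iteration trades off exploitation (high posterior mean) against exploration (high posterior variance) through the expected-improvement criterion, which is what lets the loop locate the optimum with few objective evaluations.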