Bayesian methods for adaptive models
Fast exact multiplication by the Hessian
Neural Computation
Natural gradient works efficiently in learning
Neural Computation
Bayesian Learning for Neural Networks
Numerical Initial Value Problems in Ordinary Differential Equations
Bayesian Learning via Stochastic Dynamics
Advances in Neural Information Processing Systems 5, [NIPS Conference]
CHOMP: gradient optimization techniques for efficient motion planning
Proceedings of the 2009 IEEE International Conference on Robotics and Automation (ICRA '09)
We propose a new Markov Chain Monte Carlo algorithm that generalizes the stochastic dynamics method. The algorithm explores the state space using its intrinsic geometric structure, which enables efficient sampling from complex distributions. Applied to Bayesian learning in neural networks, it produced results comparable to the best state-of-the-art method while requiring considerably less time.
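To make the idea concrete, here is a minimal sketch of one simple member of the stochastic-dynamics family: unadjusted Langevin dynamics preconditioned by a fixed metric that matches the target's geometry. The target distribution, the choice of metric, and all function names are illustrative assumptions for this sketch, not the paper's actual algorithm.

```python
import numpy as np

def grad_log_p(x, cov_inv):
    """Gradient of the log-density of a zero-mean Gaussian target (assumed target)."""
    return -cov_inv @ x

def langevin_step(x, cov_inv, G_inv, eps, rng):
    """One Langevin update preconditioned by a fixed metric G:
    x' = x + (eps^2 / 2) * G^{-1} grad log p(x) + eps * G^{-1/2}-scaled noise.
    """
    noise = rng.standard_normal(x.shape)
    L = np.linalg.cholesky(G_inv)  # G^{-1} = L L^T, so L @ noise has covariance G^{-1}
    return x + 0.5 * eps**2 * (G_inv @ grad_log_p(x, cov_inv)) + eps * (L @ noise)

def sample(n_steps, eps, rng):
    # Ill-conditioned 2-D Gaussian target; the metric is chosen as its
    # precision, so the dynamics are rescaled to the target's geometry
    # and both directions mix at a similar rate.
    cov = np.array([[1.0, 0.0], [0.0, 100.0]])
    cov_inv = np.linalg.inv(cov)
    G_inv = cov  # metric G = precision matrix of the target
    x = np.zeros(2)
    xs = []
    for _ in range(n_steps):
        x = langevin_step(x, cov_inv, G_inv, eps, rng)
        xs.append(x.copy())
    return np.array(xs)

samples = sample(20000, 0.3, np.random.default_rng(0))
print(samples.mean(axis=0), samples.std(axis=0))
```

With the metric set to the identity instead, the step size would have to shrink to suit the stiffest direction, illustrating why geometry-aware dynamics can sample complex distributions more efficiently. (Unadjusted Langevin dynamics has a small step-size bias; the paper's algorithm is a full MCMC method, which this sketch does not reproduce.)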