This paper considers the robust and efficient implementation of Gaussian process regression with a Student-t observation model, which has a non-log-concave likelihood. The challenge with the Student-t model is that inference is analytically intractable, which is why several approximate methods have been proposed. Expectation propagation (EP) has been found to be a very accurate method in many empirical studies, but the convergence of EP is known to be problematic with models containing non-log-concave site functions. In this paper we illustrate the situations where standard EP fails to converge and review different modifications and alternative algorithms for improving the convergence. We demonstrate that convergence problems may occur during the type-II maximum a posteriori (MAP) estimation of the hyperparameters and show that, for some difficult data sets, standard EP may not converge at the MAP values. We present a robust implementation that relies primarily on parallel EP updates and, in difficult cases, falls back on a moment-matching-based double-loop algorithm with an adaptively selected step size. The predictive performance of EP is compared with Laplace, variational Bayes, and Markov chain Monte Carlo approximations.
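To make the moment-matching step concrete, the sketch below shows the core EP site update for a single Student-t observation: the first two moments of the tilted distribution (cavity Gaussian times Student-t likelihood) are computed by one-dimensional numerical quadrature and converted into Gaussian site parameters. This is a minimal illustration, not the authors' implementation; the names nu, sigma, mu_cav, var_cav and the fixed quadrature grid are assumptions for this example, and a fixed grid is only a crude stand-in for adaptive quadrature.

```python
# Minimal sketch of one EP site update for a Student-t likelihood (illustrative only).
# The tilted distribution is p_hat(f) proportional to N(f | mu_cav, var_cav) * t_nu(y | f, sigma).
import numpy as np
from scipy import stats
from scipy.integrate import trapezoid


def tilted_moments(y, mu_cav, var_cav, nu=4.0, sigma=0.1, n_grid=2001):
    """Normalizer, mean, and variance of the tilted distribution via quadrature."""
    sd_cav = np.sqrt(var_cav)
    # Fixed grid around the cavity mean; real implementations use adaptive quadrature.
    f = np.linspace(mu_cav - 10 * sd_cav, mu_cav + 10 * sd_cav, n_grid)
    integrand = stats.norm.pdf(f, mu_cav, sd_cav) * stats.t.pdf(y, df=nu, loc=f, scale=sigma)
    z_hat = trapezoid(integrand, f)                          # EP marginal likelihood term
    mu_hat = trapezoid(f * integrand, f) / z_hat             # tilted mean
    var_hat = trapezoid((f - mu_hat) ** 2 * integrand, f) / z_hat  # tilted variance
    return z_hat, mu_hat, var_hat


def site_update(y, mu_cav, var_cav, **kw):
    """Gaussian site (natural) parameters that reproduce the tilted moments."""
    _, mu_hat, var_hat = tilted_moments(y, mu_cav, var_cav, **kw)
    tau_site = 1.0 / var_hat - 1.0 / var_cav        # site precision
    nu_site = mu_hat / var_hat - mu_cav / var_cav   # site precision times mean
    return tau_site, nu_site


if __name__ == "__main__":
    # An outlying observation relative to the cavity: with the non-log-concave
    # Student-t site, the tilted variance can exceed the cavity variance, so the
    # site precision can come out negative. Updates of this kind are exactly
    # where standard sequential EP may oscillate or fail to converge.
    print(site_update(y=5.0, mu_cav=0.0, var_cav=1.0))
```

In a full GP implementation this update is applied to every observation; the parallel and double-loop variants discussed in the paper differ in how such site updates are scheduled and damped, not in the moment matching itself.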