Gaussian process priors can be used to define flexible, probabilistic classification models. Unfortunately, exact Bayesian inference is analytically intractable, and various approximation techniques have been proposed. In this work we review and compare Laplace's method and Expectation Propagation for approximate Bayesian inference in the binary Gaussian process classification model. We present a comprehensive comparison of the two approximations, assessing their predictive performance and marginal likelihood estimates against results obtained by MCMC sampling. We explain theoretically, and corroborate empirically, the advantages of Expectation Propagation over Laplace's method.
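To make the first of the two approximations concrete, the following is a minimal sketch of Laplace's method for binary Gaussian process classification: Newton iterations locate the mode of the (non-Gaussian) posterior over latent function values, and the curvature at the mode yields an approximate log marginal likelihood. The logistic likelihood, squared-exponential kernel, and the toy 1-D data set are illustrative assumptions, not choices taken from the paper.

```python
import numpy as np

def rbf_kernel(x1, x2, ell=1.0, sf=1.0):
    """Squared-exponential covariance between 1-D input vectors."""
    d2 = (x1[:, None] - x2[None, :]) ** 2
    return sf**2 * np.exp(-0.5 * d2 / ell**2)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def laplace_gpc(K, y, n_iter=20):
    """Laplace approximation for binary GP classification.

    K: n x n covariance matrix; y: labels in {-1, +1}.
    Returns the posterior mode f_hat and the approximate
    log marginal likelihood.
    """
    n = len(y)
    f = np.zeros(n)
    for _ in range(n_iter):
        pi = sigmoid(f)
        W = pi * (1.0 - pi)          # negative Hessian of log p(y|f) (diagonal)
        sW = np.sqrt(W)
        B = np.eye(n) + sW[:, None] * K * sW[None, :]
        L = np.linalg.cholesky(B)
        grad = (y + 1.0) / 2.0 - pi  # gradient of log p(y|f)
        b = W * f + grad
        v = np.linalg.solve(L, sW * (K @ b))
        a = b - sW * np.linalg.solve(L.T, v)
        f = K @ a                    # Newton step toward the posterior mode
    log_lik = -np.sum(np.log1p(np.exp(-y * f)))
    logZ = -0.5 * (a @ f) + log_lik - np.sum(np.log(np.diag(L)))
    return f, logZ

# Toy 1-D example: labels follow the sign of the input plus noise.
rng = np.random.default_rng(0)
X = np.linspace(-3.0, 3.0, 30)
y = np.where(X + 0.3 * rng.standard_normal(30) > 0, 1.0, -1.0)
K = rbf_kernel(X, X) + 1e-8 * np.eye(30)
f_hat, logZ = laplace_gpc(K, y)
print(logZ)  # approximate log marginal likelihood (a negative number)
```

The approximate log marginal likelihood returned here is the quantity the paper compares against MCMC estimates; Expectation Propagation replaces the mode-and-curvature step with iterative moment matching of Gaussian site approximations.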