Mean field methods for classification with Gaussian processes
Proceedings of the 1998 conference on Advances in neural information processing systems II
We derive a mean-field algorithm for binary classification with Gaussian processes, based on the TAP approach originally proposed in the statistical physics of disordered systems. The theory also yields an approximate leave-one-out estimator for the generalization error, which is computed at no extra computational cost. We show that both a simpler "naive" mean-field theory and support vector machines (SVMs) can be derived from the TAP approach as limiting cases. For both the mean-field algorithms and support vector machines, we present simulation results on three small benchmark data sets. They show, first, that state-of-the-art performance can be obtained by using the leave-one-out estimator for model selection, and second, that the built-in leave-one-out estimators are extremely precise when compared to the exact leave-one-out estimate. We take the second result as strong support for the internal consistency of the mean-field approach.
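The abstract's central practical claim is that a leave-one-out error estimate can come "for free" from quantities already computed during training. The sketch below illustrates the same idea in the simplest setting where it is exact: kernel ridge (least-squares) classification, where the LOO prediction follows in closed form from the training weights and the diagonal of the inverse Gram matrix. This is an analogous identity, not the paper's TAP estimator; the data, kernel, and regularizer are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy +/-1 classification data (assumed setup, not the paper's benchmarks).
n = 40
X = rng.normal(size=(n, 2))
y = np.sign(X[:, 0] + 0.3 * rng.normal(size=n))

def rbf_kernel(A, B, lengthscale=1.0):
    # Squared-exponential kernel between row sets A and B.
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-0.5 * d2 / lengthscale**2)

lam = 0.1                                # ridge regularizer (assumed value)
K = rbf_kernel(X, X)
G = np.linalg.inv(K + lam * np.eye(n))   # (K + lambda*I)^{-1}
alpha = G @ y                            # training weights
f = K @ alpha                            # in-sample predictions

# Closed-form leave-one-out predictions at no extra cost:
# y_i - f_{-i}(x_i) = alpha_i / G_ii, so no retraining is needed.
f_loo = y - alpha / np.diag(G)

# Brute-force LOO for comparison: retrain n times with one point held out.
f_loo_brute = np.empty(n)
for i in range(n):
    mask = np.arange(n) != i
    a = np.linalg.solve(K[np.ix_(mask, mask)] + lam * np.eye(n - 1), y[mask])
    f_loo_brute[i] = K[i, mask] @ a

# Should agree up to floating-point error.
print(np.max(np.abs(f_loo - f_loo_brute)))
```

For the TAP and SVM cases treated in the paper, the corresponding estimator is approximate rather than exact, which is why its agreement with the exact leave-one-out estimate is taken as evidence for the internal consistency of the mean-field approach.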