The nature of statistical learning theory
Bayesian Classification With Gaussian Processes
IEEE Transactions on Pattern Analysis and Machine Intelligence
Bayesian parameter estimation via variational methods
Statistics and Computing
Sparse on-line Gaussian processes
Neural Computation
ICML '00 Proceedings of the Seventeenth International Conference on Machine Learning
Expectation Propagation for approximate Bayesian inference
UAI '01 Proceedings of the 17th Conference in Uncertainty in Artificial Intelligence
A family of algorithms for approximate Bayesian inference
The Journal of Machine Learning Research
Gaussian Processes for Classification: Mean-Field Algorithms
Neural Computation
Pattern Recognition and Machine Learning (Information Science and Statistics)
Bayesian Gaussian Process Classification with the EM-EP Algorithm
IEEE Transactions on Pattern Analysis and Machine Intelligence
Assessing Approximate Inference for Binary Gaussian Process Classification
The Journal of Machine Learning Research
Expectation propagation for approximate inference in dynamic Bayesian networks
UAI '02 Proceedings of the Eighteenth Conference on Uncertainty in Artificial Intelligence
Variational Gaussian process classifiers
IEEE Transactions on Neural Networks
Sparse Bayes Machines for Binary Classification
ICANN '08 Proceedings of the 18th international conference on Artificial Neural Networks, Part I
Network-based sparse Bayesian classification
Pattern Recognition
In this work, we propose an approach to binary classification based on an extension of Bayes Point Machines. In particular, we take into account the whole set of hypotheses that are consistent with the data (the so-called version space) as well as the intrinsic noise in the class labels. We follow a Bayesian approach and compute an approximate posterior distribution over the model parameters, which yields a predictive distribution over unseen data. The most compelling feature of the proposed model is that it learns the noise present in the data at no additional cost. All computations are carried out by means of the approximate Bayesian inference algorithm Expectation Propagation. Experimental results indicate that the proposed approach outperforms Support Vector Machines on several of the classification problems studied and is competitive with other Bayesian classification algorithms based on Gaussian Processes.
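To make the version-space idea concrete, the sketch below approximates a Bayes point for a linear classifier: each run of the perceptron on a randomly reordered training set yields one hypothesis consistent with the data, and averaging the resulting (normalized) weight vectors approximates the center of mass of the version space. This is a minimal illustrative sketch of the generic Bayes-point idea, not the paper's EP-based algorithm; all function names and the toy data are assumptions introduced here for illustration.

```python
import numpy as np

def perceptron(X, y, n_epochs=20, rng=None):
    """Train a linear perceptron on labels in {-1, +1}; on linearly
    separable data it converges to a hypothesis in the version space."""
    rng = rng or np.random.default_rng()
    w = np.zeros(X.shape[1])
    idx = np.arange(len(y))
    for _ in range(n_epochs):
        rng.shuffle(idx)  # random ordering -> a different consistent hypothesis
        errors = 0
        for i in idx:
            if y[i] * (X[i] @ w) <= 0:
                w += y[i] * X[i]
                errors += 1
        if errors == 0:  # all points classified correctly: done
            break
    return w / (np.linalg.norm(w) + 1e-12)

def bayes_point(X, y, n_samples=50, seed=0):
    """Approximate the Bayes point as the mean of version-space samples,
    each obtained from the perceptron on a random data permutation."""
    rng = np.random.default_rng(seed)
    W = np.stack([perceptron(X, y, rng=rng) for _ in range(n_samples)])
    return W.mean(axis=0)

# Hypothetical toy problem: two linearly separable classes in 2-D.
X = np.array([[2.0, 1.0], [1.5, 2.0], [-1.0, -2.0], [-2.0, -1.0]])
y = np.array([1, 1, -1, -1])
w = bayes_point(X, y)
pred = np.sign(X @ w)  # predictions from the averaged classifier
```

The averaged vector approximates the posterior-mean classifier over the version space; the paper's contribution replaces this sampling heuristic with an Expectation Propagation posterior that additionally accounts for label noise.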