Playing billiards in version space
Neural Computation
Bayesian parameter estimation via variational methods
Statistics and Computing
A comparison of scientific and engineering criteria for Bayesian model selection
Statistics and Computing
A family of algorithms for approximate Bayesian inference
The Journal of Machine Learning Research
Learning iteratively a classifier with the Bayesian Model Averaging Principle
Pattern Recognition
Compact approximations of mixture distributions for state estimation in multiagent settings
Proceedings of The 8th International Conference on Autonomous Agents and Multiagent Systems - Volume 2
Patch Learning for Incremental Classifier Design
Proceedings of ECAI 2006: 17th European Conference on Artificial Intelligence, Riva del Garda, Italy, August 29 -- September 1, 2006
Nested expectation propagation for Gaussian process classification
The Journal of Machine Learning Research
We provide a general framework for learning precise, compact, and fast representations of the Bayesian predictive distribution for a model. The framework is based on minimizing the KL divergence between the true predictive density and a suitable compact approximation. We consider various methods for doing this, both sampling-based approximations and deterministic approximations such as expectation propagation. These methods are tested on a mixture-of-Gaussians model for density estimation and on binary linear classification, with both synthetic data sets for visualization and several real data sets. Our results show significant reductions in prediction time and memory footprint.
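The core idea of the abstract can be illustrated with a minimal sketch: given samples from a Bayesian predictive density (e.g., from MCMC), compress them into a single Gaussian by moment matching, which minimizes KL(p || q) over the Gaussian family. The mixture data and function names below are illustrative assumptions, not the paper's actual method or benchmarks.

```python
import numpy as np

rng = np.random.default_rng(0)

# Stand-in for samples from the true Bayesian predictive density p(y* | D):
# a two-component Gaussian mixture mimicking a multimodal predictive.
samples = np.concatenate([
    rng.normal(-1.0, 0.5, size=5000),
    rng.normal(2.0, 0.8, size=5000),
])

# Moment matching: the Gaussian q minimizing KL(p || q) over all Gaussians
# takes p's mean and variance.
mu, var = samples.mean(), samples.var()

def compact_predictive(y):
    """Fast, constant-memory approximation to the predictive density:
    only (mu, var) is stored instead of all 10,000 samples."""
    return np.exp(-0.5 * (y - mu) ** 2 / var) / np.sqrt(2.0 * np.pi * var)
```

Evaluating `compact_predictive` costs O(1) per query regardless of how many posterior samples were drawn, which is the kind of prediction-time and memory saving the abstract reports; richer compact families (small mixtures, EP-style approximations) follow the same KL-minimization recipe.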