We propose a novel framework for approximating intractable probabilistic models, based on a free energy formulation. The approximation can be understood as replacing an average over the original intractable distribution with an average over a tractable one. It requires two tractable probability distributions that are made consistent on a set of moments and that encode different features of the original intractable distribution. In this way we can use Gaussian approximations for models with discrete or bounded variables, which allows us to include non-trivial correlations that many other methods neglect. We test the framework on toy benchmark problems with binary variables on fully connected graphs and 2D grids, and compare it with other methods, such as loopy belief propagation. Good performance is already achieved using single nodes as the tractable substructure; significant improvements are obtained when a spanning tree is used instead.
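To make the moment-matching idea concrete, the following is a minimal sketch of how such a free energy can be assembled. The notation is ours, not verbatim from the paper: we assume the intractable density factors as p(x) = f_q(x) f_r(x) / Z with each factor tractable on its own, and let g(x) collect the statistics (e.g., means and correlations) on which the two approximations are made consistent.

```latex
% Sketch of an expectation-consistent free energy (our notation,
% under the factorization assumption stated above). Each partition
% function below involves only one tractable factor at a time:
\begin{align*}
  Z_q(\lambda_q) &= \int f_q(x)\, e^{\lambda_q^{\top} g(x)}\, \mathrm{d}x, &
  Z_r(\lambda_r) &= \int f_r(x)\, e^{\lambda_r^{\top} g(x)}\, \mathrm{d}x, &
  Z_s(\lambda_s) &= \int e^{\lambda_s^{\top} g(x)}\, \mathrm{d}x .
\end{align*}
% The intractable free energy is approximated by combining the three
% tractable ones, subject to the constraint \lambda_s = \lambda_q + \lambda_r:
\begin{equation*}
  -\ln Z \;\approx\; -\ln Z_q(\lambda_q) \;-\; \ln Z_r(\lambda_r) \;+\; \ln Z_s(\lambda_s) .
\end{equation*}
% The parameters are fixed by expectation consistency: the expectation
% of g(x) must agree under all three tractable distributions,
\begin{equation*}
  \nabla_{\lambda_q} \ln Z_q \;=\; \nabla_{\lambda_r} \ln Z_r \;=\; \nabla_{\lambda_s} \ln Z_s .
\end{equation*}
```

In the binary experiments mentioned above, f_q would carry the single-node (or spanning-tree) terms and f_r the Gaussian-like coupling terms, so the matched moments are the means and correlations of the variables; this is how a Gaussian approximation can be applied to discrete variables while retaining non-trivial correlations.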