Mean-field variational methods are widely used for approximate posterior inference in many probabilistic models. In a typical application, mean-field methods approximate the posterior with a coordinate-ascent optimization algorithm. When the model is conditionally conjugate, the coordinate updates are easy to derive and available in closed form. However, many models of interest, such as the correlated topic model and Bayesian logistic regression, are nonconjugate. In these models, mean-field methods cannot be applied directly, and practitioners have had to develop variational algorithms on a case-by-case basis. In this paper, we develop two generic methods for nonconjugate models: Laplace variational inference and delta method variational inference. Our methods have several advantages: they yield easily derived variational algorithms for a wide class of nonconjugate models; they extend and unify some existing algorithms that were derived for specific models; and they work well on real-world data sets. We studied our methods on the correlated topic model, Bayesian logistic regression, and hierarchical Bayesian logistic regression.
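To make the Laplace idea concrete, the sketch below applies a plain Laplace approximation to Bayesian logistic regression, one of the nonconjugate models named in the abstract: find the MAP weights by Newton's method, then use a Gaussian centered at the mode with covariance given by the inverse Hessian of the negative log posterior. This is a minimal, self-contained illustration of the Laplace step (function names and the isotropic Gaussian prior are assumptions for this sketch), not the paper's full variational algorithm.

```python
import numpy as np

def laplace_approx_logistic(X, y, prior_var=1.0, iters=50):
    """Laplace approximation to the posterior over weights in Bayesian
    logistic regression with a N(0, prior_var * I) prior: fit a Gaussian
    N(w_map, H^{-1}) centered at the MAP estimate, where H is the Hessian
    of the negative log posterior at the mode.

    X: (n, d) design matrix; y: (n,) labels in {0, 1}.
    (Hypothetical helper for illustration; not the paper's algorithm.)
    """
    n, d = X.shape
    w = np.zeros(d)
    for _ in range(iters):
        p = 1.0 / (1.0 + np.exp(-X @ w))            # predicted probabilities
        grad = X.T @ (p - y) + w / prior_var        # gradient of neg. log posterior
        r = p * (1.0 - p)                           # per-example curvature weights
        H = X.T @ (X * r[:, None]) + np.eye(d) / prior_var
        w = w - np.linalg.solve(H, grad)            # Newton (IRLS) step
    # Hessian at the mode gives the Gaussian covariance.
    p = 1.0 / (1.0 + np.exp(-X @ w))
    r = p * (1.0 - p)
    H = X.T @ (X * r[:, None]) + np.eye(d) / prior_var
    return w, np.linalg.inv(H)                      # approximate posterior mean, covariance
```

In a full Laplace variational inference scheme this Gaussian fit would replace the intractable coordinate update for the nonconjugate variable inside the mean-field loop; the standalone version above shows only that inner approximation.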