An autonomous variational inference algorithm for arbitrary graphical models must optimize the variational approximation over the space of model parameters as well as over the choice of tractable family used in the approximation. In this paper, we present a novel combination of graph partitioning algorithms with a generalized mean field (GMF) inference algorithm: the combination optimizes over disjoint clusterings of the variables and performs inference using those clusters. We provide a formal analysis of the relationship between the graph cut and the GMF approximation, and empirically explore several graph partition strategies. Our empirical results clearly support a weighted version of MinCut as a useful clustering algorithm for GMF inference, consistent with the implications of the formal analysis.
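The cluster-selection step described above can be sketched as a weighted-cut partition of the model's variables, where edge weights stand for coupling strengths between variables. The following is a minimal, self-contained illustration, not the paper's actual algorithm: the toy coupling weights, the variable names, and the greedy single-move local search (a simple stand-in for a proper weighted MinCut solver such as Kernighan–Lin or Stoer–Wagner) are all assumptions made for the example.

```python
# Hedged sketch: weighted MinCut-style partitioning for GMF cluster selection.
# Edge weights model pairwise coupling strengths; variables that interact
# strongly should end up in the same cluster so the cut (the coupling the
# GMF approximation severs) is small. Balance/tractability constraints on
# cluster sizes are omitted for brevity.

def cut_weight(weights, side):
    """Total weight of edges crossing the partition (side: var -> 0/1)."""
    return sum(w for (u, v), w in weights.items() if side[u] != side[v])

def greedy_weighted_mincut(weights, variables, max_iters=100):
    """Greedy single-move local search that reduces the weighted cut.

    A stand-in for the weighted MinCut clustering discussed in the abstract;
    real implementations would use a dedicated graph-partitioning algorithm.
    """
    side = {v: i % 2 for i, v in enumerate(variables)}  # initial split
    for _ in range(max_iters):
        improved = False
        for v in variables:
            current = cut_weight(weights, side)
            side[v] ^= 1  # tentatively move v to the other cluster
            if cut_weight(weights, side) < current:
                improved = True  # keep the move
            else:
                side[v] ^= 1  # revert
        if not improved:
            break
    return ([v for v in variables if side[v] == 0],
            [v for v in variables if side[v] == 1])

# Toy pairwise coupling strengths between five variables (illustrative only).
weights = {("a", "b"): 3.0, ("b", "c"): 2.5, ("c", "d"): 0.2,
           ("d", "e"): 4.0, ("a", "c"): 1.0}
left, right = greedy_weighted_mincut(weights, ["a", "b", "c", "d", "e"])
print(left, right)  # two disjoint clusters covering all five variables
```

The resulting disjoint clusters would then serve as the tractable components over which GMF inference is performed; only the (low-weight) couplings crossing the cut are approximated.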