We present new MCMC algorithms for computing the posterior distributions and expectations of the unknown variables in undirected graphical models with regular structure. For demonstration purposes, we focus on Markov Random Fields (MRFs). By partitioning an MRF into non-overlapping trees, it is possible to compute the posterior distribution of a particular tree exactly by conditioning on the remaining tree. These exact solutions allow us to construct efficient blocked and Rao-Blackwellised MCMC algorithms. We show empirically that tree sampling is considerably more efficient than other partitioned sampling schemes and the naive Gibbs sampler, even in cases where loopy belief propagation fails to converge. Using the theoretical measure of maximal correlation, we prove that tree sampling exhibits lower variance than the naive Gibbs sampler and other naive partitioning schemes. We also construct new information-theoretic tools for comparing different MCMC schemes and show that, under these criteria, tree sampling is more efficient.
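The blocked scheme described above can be illustrated with a minimal sketch, not the paper's implementation: here each row of a grid Ising model serves as a chain-shaped tree block, sampled exactly by forward filtering / backward sampling conditioned on the adjacent rows. The grid size, coupling strength `J`, iteration count, and the helper `sample_row_given_neighbors` are all illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

H, W = 4, 5   # grid size (assumed for illustration)
J = 0.5       # uniform coupling strength (assumed)
n_iters = 200

# Ising MRF on a grid: p(x) ∝ exp(J * Σ_{neighbors} x_i x_j), x ∈ {-1, +1}
x = rng.choice([-1, 1], size=(H, W))
states = np.array([-1, 1])

def sample_row_given_neighbors(row_above, row_below):
    """Exactly sample one row (a chain, hence a tree) conditioned on the
    adjacent rows, via forward filtering / backward sampling."""
    # Local field contributed by the vertical neighbors of each site.
    h = np.zeros(W)
    if row_above is not None:
        h += J * row_above
    if row_below is not None:
        h += J * row_below
    # Forward pass: m[t, s] ∝ p(x_0..x_t, x_t = s | conditioned rows).
    trans = np.exp(J * np.outer(states, states))  # horizontal pairwise potential
    m = np.zeros((W, 2))
    m[0] = np.exp(h[0] * states)
    for t in range(1, W):
        m[t] = np.exp(h[t] * states) * (m[t - 1] @ trans)
        m[t] /= m[t].sum()  # normalize for numerical stability
    # Backward pass: sample the chain from the exact conditional.
    row = np.zeros(W, dtype=int)
    p = m[-1] / m[-1].sum()
    row[-1] = rng.choice(states, p=p)
    for t in range(W - 2, -1, -1):
        p = m[t] * np.exp(J * states * row[t + 1])
        p /= p.sum()
        row[t] = rng.choice(states, p=p)
    return row

# Blocked Gibbs sweep: each row-tree is resampled exactly given the rest.
for _ in range(n_iters):
    for i in range(H):
        above = x[i - 1] if i > 0 else None
        below = x[i + 1] if i < H - 1 else None
        x[i] = sample_row_given_neighbors(above, below)
```

Because each block is a tree, its conditional is computed exactly rather than approximated, which is what enables the Rao-Blackwellisation and variance reduction claimed above; the paper's actual partitions are larger trees than single rows.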