Exploiting sparse Markov and covariance structure in multiresolution models
ICML '09 Proceedings of the 26th Annual International Conference on Machine Learning
In this paper, we consider the problem of learning Gaussian multiresolution (MR) models in which data are available only at the finest scale, and the coarser, hidden variables serve to capture long-distance dependencies. Tree-structured MR models have limited modeling power, as variables at one scale are forced to be uncorrelated with each other conditioned on the other scales. We propose a new class of Gaussian MR models in which variables at each scale have sparse conditional covariance structure conditioned on the other scales. Our goal is to learn a tree-structured graphical model connecting variables across scales (which translates into sparsity in the inverse covariance), while at the same time learning sparse structure for the conditional covariance (not its inverse) within each scale conditioned on the other scales. This model leads to a new and efficient inference algorithm that is similar to multipole methods in computational physics. We demonstrate the modeling and inference advantages of our approach over MR tree models and over single-scale approximation methods that do not use hidden variables.
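As a minimal numerical illustration of this model class (not the paper's algorithm), the sketch below builds a two-scale Gaussian model with a single coarse hidden variable and a banded, hypothetical in-scale covariance. It shows the key property: the marginal covariance of the fine-scale variables is dense (long-range correlation induced through the coarse scale), while their covariance conditioned on the coarse variable is sparse rather than diagonal, as a tree model would force.

```python
import numpy as np

# Hypothetical two-scale model (parameters chosen for illustration only):
# coarse hidden variable z ~ N(0, 1); fine-scale vector x = a*z + w,
# where w has a *sparse* (tridiagonal) covariance S rather than the
# diagonal covariance that a tree-structured MR model would impose.
n = 6
a = np.ones(n)                        # loading of each fine-scale node on z
S = np.eye(n)                         # conditional covariance Cov(x | z)
for i in range(n - 1):                # tridiagonal: sparse in-scale structure
    S[i, i + 1] = S[i + 1, i] = 0.3

# Joint covariance of (z, x): Var(z) = 1, Cov(z, x) = a,
# Cov(x) = a a^T + S (law of total covariance).
C = np.empty((n + 1, n + 1))
C[0, 0] = 1.0
C[0, 1:] = C[1:, 0] = a
C[1:, 1:] = np.outer(a, a) + S

# The marginal covariance of x is dense: distant fine-scale nodes are
# correlated through the coarse variable z ...
Cx = C[1:, 1:]
print(Cx[0, n - 1])                   # nonzero long-range correlation

# ... but conditioning on z recovers the sparse S via the Schur complement:
# Cov(x | z) = Cov(x) - Cov(x, z) Var(z)^{-1} Cov(z, x)
cond = Cx - np.outer(a, a)
assert np.allclose(cond, S)
```

The same computation hints at why tree models are restrictive: setting the off-diagonal entries of `S` to zero makes the fine-scale variables conditionally uncorrelated given `z`, which is exactly the limitation the sparse in-scale conditional covariance is meant to relax.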