Gibbsian fields, or Markov random fields, are widely used in Bayesian image analysis, but learning Gibbs models is computationally expensive. The cost is especially pronounced for the recent minimax entropy (FRAME) models, which use large neighborhoods and hundreds of parameters. In this paper, we present a common framework for learning Gibbs models. We identify two key factors that determine the accuracy and speed of learning Gibbs models: the efficiency of the likelihood functions and the variance in approximating partition functions by Monte Carlo integration. We propose three new learning algorithms. In particular, we are interested in a maximum satellite likelihood estimator, which uses a set of precomputed Gibbs models, called "satellites," to approximate likelihood functions. This algorithm can approximately estimate the minimax entropy model for textures in seconds on an HP workstation. The performance of the various learning algorithms is compared in our experiments.
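The satellite idea can be illustrated with a minimal sketch. The key identity is that for two Gibbs models with parameters θ and θ_s sharing features H(x), the ratio of partition functions satisfies Z(θ)/Z(θ_s) = E_{θ_s}[exp((θ − θ_s) · H(x))], so samples from a precomputed "satellite" model with known log Z(θ_s) yield a Monte Carlo estimate of log Z(θ). The toy model, feature choice, and function names below are illustrative assumptions, not the paper's actual implementation; the state space is kept tiny so the satellite's partition function can be tabulated exactly, standing in for the paper's precomputed satellites.

```python
import itertools
import math
import random

def features(x):
    # Toy feature vector for a 1-D binary "texture": (number of 1-spins,
    # number of agreeing neighbor pairs). Illustrative choice, not FRAME filters.
    n = len(x)
    return (sum(x), sum(1 for i in range(n - 1) if x[i] == x[i + 1]))

def log_potential(theta, x):
    h = features(x)
    return theta[0] * h[0] + theta[1] * h[1]

def exact_log_Z(theta, n):
    # Brute-force log partition function; feasible only for tiny n.
    # Plays the role of the satellite's *precomputed* normalizing constant.
    return math.log(sum(math.exp(log_potential(theta, x))
                        for x in itertools.product([0, 1], repeat=n)))

def sample_gibbs(theta, n, rng, k):
    # Exact sampler by enumeration (a stand-in for MCMC sampling from
    # the satellite model in the full-scale setting).
    states = list(itertools.product([0, 1], repeat=n))
    weights = [math.exp(log_potential(theta, x)) for x in states]
    return rng.choices(states, weights=weights, k=k)

def satellite_log_Z(theta, theta_sat, n, num_samples, rng):
    # Importance-sampling estimate of log Z(theta) using a satellite model:
    #   Z(theta) / Z(theta_sat) ≈ mean over x ~ p_sat of
    #       exp((theta - theta_sat) · H(x)).
    dtheta = (theta[0] - theta_sat[0], theta[1] - theta_sat[1])
    total = 0.0
    for x in sample_gibbs(theta_sat, n, rng, num_samples):
        h = features(x)
        total += math.exp(dtheta[0] * h[0] + dtheta[1] * h[1])
    return math.log(total / num_samples) + exact_log_Z(theta_sat, n)
```

As in the paper's analysis, the variance of this Monte Carlo estimate governs its accuracy: the closer the satellite θ_s is to the target θ, the flatter the importance weights and the fewer samples are needed, which is why a bank of precomputed satellites spread over parameter space makes likelihood evaluation fast.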