The use of mixture models in image and signal processing has attracted considerable interest, both for their theoretical development and for their usefulness in a range of applications. In recent years, researchers have tackled the mixture estimation and selection problem for modeling complex datasets with a variety of techniques. In theory, it is well known that fully Bayesian approaches to this problem are optimal. Bayesian learning allows prior knowledge to be incorporated in a formally coherent way that avoids overfitting. In this paper, we propose a fully Bayesian approach to finite Beta mixture learning using a reversible jump Markov chain Monte Carlo (RJMCMC) technique that simultaneously handles cluster assignment, parameter estimation, and selection of the optimal number of clusters. The adverb "fully" is justified by the fact that all quantities of interest in our model, including the number of clusters and the missing values, are treated as random variables for which priors are specified and posteriors are approximated via RJMCMC. Our work is motivated by the fact that Beta mixtures can fit virtually any distributional shape and thus constitute a flexible class of models for problems and applications involving measurements and features that deviate markedly from the Gaussian shape. The usefulness of the proposed approach is demonstrated on synthetic mixture data, real data, and an application to texture classification and retrieval.
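To make the modeling object concrete, the following is a minimal, hedged sketch of evaluating a finite Beta mixture density (the quantity whose parameters, weights, and component count the paper's RJMCMC sampler would infer). The function names and the two-component example parameters are illustrative assumptions, not the paper's implementation; only the standard library is used.

```python
import math

def beta_pdf(x, a, b):
    """Beta(a, b) density at x in (0, 1), via log-gamma for stability."""
    log_norm = math.lgamma(a + b) - math.lgamma(a) - math.lgamma(b)
    return math.exp(log_norm + (a - 1.0) * math.log(x) + (b - 1.0) * math.log(1.0 - x))

def beta_mixture_pdf(x, weights, alphas, betas):
    """Finite Beta mixture density: sum_j w_j * Beta(x; a_j, b_j)."""
    return sum(w * beta_pdf(x, a, b)
               for w, a, b in zip(weights, alphas, betas))

# Illustrative two-component mixture: one mode near 0.2, one near 0.8.
w = [0.4, 0.6]
a = [2.0, 8.0]
b = [8.0, 2.0]
density_at_half = beta_mixture_pdf(0.5, w, a, b)
```

Because each Beta component can be unimodal, U-shaped, or skewed depending on (a, b), such mixtures can approximate a wide variety of shapes on (0, 1), which is the flexibility the abstract appeals to.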