Elements of information theory
A Greedy EM Algorithm for Gaussian Mixture Learning
Neural Processing Letters
EMMCVPR '99 Proceedings of the Second International Workshop on Energy Minimization Methods in Computer Vision and Pattern Recognition
Learning a multivariate Gaussian mixture model with the reversible jump MCMC algorithm
Statistics and Computing
Multivariate mixtures of normals with unknown number of components
Statistics and Computing
SMEM Algorithm for Mixture Models
Neural Computation
A kurtosis-based dynamic approach to Gaussian mixture modeling
IEEE Transactions on Systems, Man, and Cybernetics, Part A: Systems and Humans
Asymptotic theory of greedy approximations to minimal k-point random graphs
IEEE Transactions on Information Theory
In this paper we address the problem of estimating the parameters of a Gaussian mixture model. Although the EM (Expectation-Maximization) algorithm yields the maximum-likelihood solution, it requires careful initialization of the parameters, and the optimal number of kernels in the mixture may be unknown beforehand. We propose a criterion based on the entropy of the pdf (probability density function) associated with each kernel to measure the quality of a given mixture model. Two different methods for estimating the Shannon entropy are proposed, and a modification of the classical EM algorithm to find the optimal number of kernels in the mixture is presented. We test our algorithm on probability density estimation, pattern recognition, and color image segmentation.
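The abstract contrasts an entropy-based quality criterion with a plug-in (empirical) entropy estimate. As a minimal sketch of that idea (not the authors' actual estimators, which the abstract does not specify): for a Gaussian kernel the Shannon differential entropy has a closed form, H = ½ log((2πe)^d |Σ|), and it can be compared against an empirical estimate computed from the samples soft-assigned to that kernel. All function names below are illustrative.

```python
import numpy as np

def gaussian_entropy(cov):
    """Closed-form differential entropy of a multivariate Gaussian:
    H = 0.5 * log((2*pi*e)^d * |Sigma|)."""
    d = cov.shape[0]
    sign, logdet = np.linalg.slogdet(cov)
    return 0.5 * (d * np.log(2 * np.pi * np.e) + logdet)

def plugin_entropy(samples):
    """Plug-in estimate: fit a Gaussian to the samples and average the
    negative log-density, i.e. an empirical Shannon entropy estimate.
    (Illustrative; the paper's two estimators are not specified here.)"""
    mu = samples.mean(axis=0)
    cov = np.cov(samples, rowvar=False)
    d = samples.shape[1]
    inv = np.linalg.inv(cov)
    diff = samples - mu
    # Mahalanobis quadratic form for every sample at once.
    quad = np.einsum('ij,jk,ik->i', diff, inv, diff)
    logdet = np.linalg.slogdet(cov)[1]
    log_density = -0.5 * (d * np.log(2 * np.pi) + logdet + quad)
    return -log_density.mean()
```

For data that really is Gaussian the two values agree closely, so a large gap between them can flag a kernel that is poorly modeling its data, which is the kind of signal a model-selection criterion over the number of kernels could use.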