Supervised learning of Gaussian mixture models for visual vocabulary generation
Pattern Recognition
In this paper, a fast, globally supervised learning algorithm for Gaussian mixture models based on maximum relative entropy (MRE) is proposed. To reduce the computational complexity of evaluating the Gaussian component probability densities, the concept of a quasi-Gaussian probability density is used to compute simplified probabilities. Four learning algorithms, namely maximum mutual information (MMI), maximum likelihood estimation (MLE), generalized probabilistic descent (GPD), and the proposed maximum relative entropy (MRE) algorithm, are evaluated using a random-experiment approach. The experimental results show that MRE outperforms GPD, MMI, and MLE in both accuracy and training speed.
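The abstract does not give the details of the MRE update or the quasi-Gaussian simplification, but the MLE baseline it compares against is the standard EM-fitted Gaussian mixture. The sketch below is a minimal, hypothetical illustration of that baseline, not the paper's method: one diagonal-covariance GMM is fitted per class with EM, and a sample is assigned to the class whose mixture gives the higher log-likelihood. All function names and the toy data are assumptions for illustration.

```python
import numpy as np

def fit_gmm(X, k=2, iters=50, seed=0):
    """Fit a diagonal-covariance GMM by EM (the MLE baseline, not the paper's MRE)."""
    rng = np.random.default_rng(seed)
    n, d = X.shape
    means = X[rng.choice(n, k, replace=False)]        # init means from random samples
    vars_ = np.tile(X.var(axis=0), (k, 1))            # init variances from data variance
    weights = np.full(k, 1.0 / k)
    for _ in range(iters):
        # E-step: responsibilities of each component for each sample
        logp = (-0.5 * (((X[:, None, :] - means) ** 2) / vars_
                        + np.log(2 * np.pi * vars_)).sum(axis=2)
                + np.log(weights))
        logp -= logp.max(axis=1, keepdims=True)       # stabilize before exponentiating
        r = np.exp(logp)
        r /= r.sum(axis=1, keepdims=True)
        # M-step: re-estimate mixture weights, means, and variances
        nk = r.sum(axis=0)
        weights = nk / n
        means = (r.T @ X) / nk[:, None]
        vars_ = (r.T @ (X ** 2)) / nk[:, None] - means ** 2 + 1e-6
    return weights, means, vars_

def log_likelihood(X, params):
    """Per-sample log-likelihood under a fitted diagonal GMM (log-sum-exp over components)."""
    weights, means, vars_ = params
    logp = (-0.5 * (((X[:, None, :] - means) ** 2) / vars_
                    + np.log(2 * np.pi * vars_)).sum(axis=2)
            + np.log(weights))
    m = logp.max(axis=1, keepdims=True)
    return (m + np.log(np.exp(logp - m).sum(axis=1, keepdims=True))).ravel()

# Toy two-class data: each class is itself a mixture of two Gaussians
rng = np.random.default_rng(1)
X0 = np.vstack([rng.normal(0, 1, (100, 2)), rng.normal(4, 1, (100, 2))])
X1 = np.vstack([rng.normal(8, 1, (100, 2)), rng.normal(12, 1, (100, 2))])
p0, p1 = fit_gmm(X0), fit_gmm(X1)

# Classify by comparing class-conditional mixture likelihoods
pred = (log_likelihood(np.vstack([X0, X1]), p1)
        > log_likelihood(np.vstack([X0, X1]), p0)).astype(int)
acc = (pred == np.r_[np.zeros(200), np.ones(200)]).mean()
print(acc)
```

A supervised method such as the paper's MRE would replace the per-class MLE fit with an objective that accounts for the competing classes; this sketch only fixes the decision rule both share.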