The Gaussian mixture is a powerful statistical tool that has been widely used in information processing and data analysis. However, its model selection, i.e., choosing the number of Gaussians in the mixture, remains a difficult problem. Fortunately, the recently established Bayesian Ying-Yang (BYY) harmony function provides an efficient criterion for model selection in Gaussian mixture modeling. In this paper, we propose a BYY split-and-merge EM algorithm for Gaussian mixtures that maximizes the BYY harmony function by dynamically splitting or merging the unsuitable Gaussians in the mixture estimated by the EM algorithm at each step. The experiments demonstrate that this BYY split-and-merge EM algorithm performs both model selection and parameter estimation efficiently for Gaussian mixture modeling.
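The split-and-merge idea can be illustrated with a minimal sketch (not the authors' implementation): fit a 1-D Gaussian mixture by EM, score it with the empirical BYY harmony function H = (1/N) Σ_n Σ_k p(k|x_n) ln[α_k q(x_n|θ_k)], and greedily accept a moment-preserving merge of two components only when it raises the harmony. The data, the initialization, and the specific merge rule below are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def normal_pdf(x, mu, var):
    return np.exp(-0.5 * (x - mu) ** 2 / var) / np.sqrt(2 * np.pi * var)

def em_fit(x, w, mu, var, n_iter=50):
    """Standard EM updates for a 1-D Gaussian mixture."""
    for _ in range(n_iter):
        # E-step: posterior responsibilities p(k | x_n)
        dens = w * normal_pdf(x[:, None], mu, var)          # shape (N, K)
        resp = dens / dens.sum(axis=1, keepdims=True)
        # M-step: re-estimate weights, means, variances
        nk = resp.sum(axis=0)
        w = nk / len(x)
        mu = (resp * x[:, None]).sum(axis=0) / nk
        var = (resp * (x[:, None] - mu) ** 2).sum(axis=0) / nk
        var = np.maximum(var, 1e-6)                         # guard against collapse
    return w, mu, var

def harmony(x, w, mu, var):
    """Empirical BYY harmony: (1/N) sum_n sum_k p(k|x_n) ln[w_k q(x_n|k)]."""
    dens = w * normal_pdf(x[:, None], mu, var)
    resp = dens / dens.sum(axis=1, keepdims=True)
    return (resp * np.log(dens + 1e-300)).sum() / len(x)

def merge_pair(w, mu, var, i, j):
    """Moment-preserving merge of components i and j into one Gaussian."""
    wm = w[i] + w[j]
    mm = (w[i] * mu[i] + w[j] * mu[j]) / wm
    vm = (w[i] * (var[i] + (mu[i] - mm) ** 2)
          + w[j] * (var[j] + (mu[j] - mm) ** 2)) / wm
    keep = [k for k in range(len(w)) if k not in (i, j)]
    return np.append(w[keep], wm), np.append(mu[keep], mm), np.append(var[keep], vm)

# Data from 2 well-separated Gaussians, deliberately over-fitted with K = 3.
x = np.concatenate([rng.normal(-4, 1, 400), rng.normal(4, 1, 400)])
w, mu, var = em_fit(x, np.ones(3) / 3, np.array([-4.0, 3.5, 4.5]), np.ones(3))

# Greedy merge step: keep a merge only if it raises the BYY harmony.
best, h0 = (w, mu, var), harmony(x, w, mu, var)
for i in range(len(w)):
    for j in range(i + 1, len(w)):
        cand = em_fit(x, *merge_pair(w, mu, var, i, j), n_iter=20)
        h = harmony(x, *cand)
        if h > h0:
            best, h0 = cand, h

print(len(best[0]), np.round(np.sort(best[1]), 2))  # surviving K and the means
```

Because the two redundant components share the weight that a single Gaussian would hold, the ln α_k term of the harmony penalizes the over-fitted model, so the merge that restores K = 2 is accepted; a split step would work symmetrically, dividing a poorly fitting component and again keeping the change only if the harmony increases.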