Entropy regularization, automatic model selection, and unsupervised image segmentation
PAKDD'07 Proceedings of the 11th Pacific-Asia conference on Advances in knowledge discovery and data mining
In Gaussian mixture (GM) modeling, it is crucial to select the number of Gaussians for a given sample data set. In this paper, we propose a gradient entropy regularized likelihood (ERL) learning algorithm for Gaussian mixtures that solves this model selection problem within the framework of regularization theory. Simulation experiments demonstrate that the gradient ERL learning algorithm automatically selects an appropriate number of Gaussians during parameter learning on a sample data set and yields a good estimate of the parameters of the actual Gaussian mixture, even when two or more of the actual Gaussians overlap strongly.
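The abstract does not spell out the ERL objective or the gradient rules. A common form of entropy regularization adds the negative Shannon entropy of the mixing weights to the average log-likelihood, so that gradient ascent rewards sparse mixing proportions and drives redundant components' weights toward zero. The following is a minimal one-dimensional sketch under that assumption (fixed unit variances, softmax-parameterized mixing weights; the objective, hyperparameters, and all names such as `fit_erl` are illustrative, not the authors' implementation):

```python
import numpy as np

def gaussian_pdf(x, mu, sigma):
    """Univariate normal density."""
    return np.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * np.sqrt(2.0 * np.pi))

def fit_erl(x, K=4, lam=0.05, lr=0.1, iters=800, sigma=1.0):
    """Gradient ascent on an assumed ERL objective:
        F = (1/N) sum_n log sum_k alpha_k N(x_n | mu_k, sigma^2)
            + lam * sum_k alpha_k log alpha_k   (negative entropy of weights)
    Mixing weights alpha are parameterized by a softmax over beta."""
    beta = np.zeros(K)                         # softmax logits for the weights
    mu = np.linspace(x.min(), x.max(), K)      # spread initial means over data
    for _ in range(iters):
        alpha = np.exp(beta - beta.max())
        alpha /= alpha.sum()
        dens = alpha[None, :] * gaussian_pdf(x[:, None], mu[None, :], sigma)
        px = dens.sum(axis=1)                  # mixture density at each point
        r = dens / px[:, None]                 # posterior responsibilities
        # Gradient of F w.r.t. the means.
        grad_mu = (r * (x[:, None] - mu[None, :])).mean(axis=0) / sigma ** 2
        # Gradient w.r.t. alpha, then chained through the softmax to beta.
        g = r.mean(axis=0) / alpha + lam * (np.log(alpha) + 1.0)
        grad_beta = alpha * (g - np.dot(alpha, g))
        mu += lr * grad_mu
        beta += lr * grad_beta
    alpha = np.exp(beta - beta.max())
    alpha /= alpha.sum()
    return alpha, mu

# Two true Gaussians, deliberately over-fitted with K = 4 components:
# the entropy term shrinks the redundant components' mixing weights.
rng = np.random.default_rng(1)
x = np.concatenate([rng.normal(-3.0, 0.5, 100), rng.normal(3.0, 0.5, 100)])
alpha, mu = fit_erl(x, K=4, lam=0.05)
top2 = np.sort(mu[np.argsort(alpha)[-2:]])
print("weights:", np.round(alpha, 3))
print("means of two heaviest components:", np.round(top2, 2))
```

The "rich-get-richer" effect comes from the `lam * log(alpha_k)` term: a component that already carries more weight receives a larger ascent step, so components competing for the same cluster are gradually annihilated rather than shared, which is how the weight pattern indicates the selected number of Gaussians.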