BYY harmony learning, structural RPCL, and topological self-organizing on mixture models
Neural Networks - New developments in self-organizing maps
In finite mixture modelling, selecting the number of components for a data set is crucial. We have proposed an entropy regularized likelihood (ERL) learning principle for finite mixtures to solve this model selection problem under regularization theory. In this paper, we further give an asymptotic analysis of ERL learning and find that globally minimizing the ERL function in a simulated-annealing manner (i.e., with the regularization factor gradually reduced to zero) leads to automatic model selection on finite mixtures together with good parameter estimation. Compared with the EM algorithm, ERL learning can escape the local minima of the negative likelihood and remains robust to initialization. Simulation experiments confirm our theoretical analysis.
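To make the annealing idea concrete, the following is a minimal sketch (not the authors' algorithm) of an EM-style loop for a 1-D Gaussian mixture in which the posteriors are sharpened by a regularization factor `gamma` that is annealed toward zero, so that redundant components lose mixing weight and are pruned; the data, the over-specified component count `K = 5`, the sharpening exponent `(1 + gamma)`, and the annealing schedule are all illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
# Synthetic 1-D data: two well-separated Gaussian clusters.
X = np.concatenate([rng.normal(-4.0, 1.0, 300), rng.normal(4.0, 1.0, 300)])

K = 5                          # deliberately over-specified number of components
alpha = np.full(K, 1.0 / K)    # mixing weights
mu = rng.choice(X, K)          # component means, seeded from the data
var = np.full(K, np.var(X))    # component variances

def log_gauss(x, m, v):
    """Elementwise log-density of N(m, v) at x (broadcasts)."""
    return -0.5 * (np.log(2.0 * np.pi * v) + (x - m) ** 2 / v)

gamma = 0.5                    # regularization factor, annealed toward zero
for _ in range(200):
    # E-step with entropy-style sharpening: raising the joint densities to
    # the power (1 + gamma) penalizes ambiguous (high-entropy) assignments,
    # mimicking the competitive, RPCL-like effect of the ERL criterion.
    logp = log_gauss(X[:, None], mu[None, :], var[None, :]) + np.log(alpha[None, :])
    logp *= (1.0 + gamma)
    logp -= logp.max(axis=1, keepdims=True)   # stabilize before exponentiating
    h = np.exp(logp)
    h /= h.sum(axis=1, keepdims=True)

    # M-step: standard maximum-likelihood updates given the sharpened h.
    nk = h.sum(axis=0)
    alpha = np.clip(nk / nk.sum(), 1e-12, None)
    mu = (h * X[:, None]).sum(axis=0) / np.maximum(nk, 1e-12)
    var = (h * (X[:, None] - mu[None, :]) ** 2).sum(axis=0) / np.maximum(nk, 1e-12)
    var = np.maximum(var, 1e-6)               # keep variances strictly positive

    gamma *= 0.98              # simulated-annealing schedule: gamma -> 0

# Components whose mixing weight has collapsed are considered pruned,
# giving the automatically selected model size.
k_effective = int((alpha > 0.01).sum())
```

As `gamma` shrinks, the loop reduces to ordinary EM, so the final fit is a maximum-likelihood estimate over only the surviving components; the early high-`gamma` phase is what drives spurious components' weights toward zero.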