Letters: An iterative algorithm for entropy regularized likelihood learning on Gaussian mixture with automatic model selection

  • Authors:
  • Zhiwu Lu

  • Affiliations:
  • Institute of Computer Science and Technology of Peking University, Beijing 100871, China

  • Venue:
  • Neurocomputing
  • Year:
  • 2006


Abstract

As for Gaussian mixture modeling, the key problem is to select the number of Gaussians in the mixture. Based on regularization theory, we aim to make this kind of model selection by implementing an iterative algorithm for entropy regularized likelihood (ERL) learning on Gaussian mixture. The simulation experiments have demonstrated that the ERL algorithm can automatically detect the number of Gaussians with a good estimation of the parameters in the original mixture, even on a sample set with a high degree of overlap. Moreover, the ERL algorithm also leads to a promising result when applied to the classification of iris data.