Random swap EM algorithm for Gaussian mixture models

  • Authors:
  • Qinpei Zhao; Ville Hautamäki; Ismo Kärkkäinen; Pasi Fränti

  • Affiliations:
  • School of Computing, University of Eastern Finland, FI-80101, Finland (all authors)

  • Venue:
  • Pattern Recognition Letters
  • Year:
  • 2012


Abstract

The expectation maximization (EM) algorithm is a popular way to estimate the parameters of Gaussian mixture models. Unfortunately, its performance depends heavily on the initialization. We propose a random swap EM (RSEM) for the initialization of EM. Instead of starting from a completely new solution in each repeat, as in repeated EM, we apply a random perturbation to the current solution before continuing the EM iterations. The removal and addition steps in random swap are simpler and more natural than split-and-merge or crossover-and-mutation operations. The most important benefits of random swap are its simplicity and efficiency: RSEM needs only the number of swaps as a parameter, in contrast to the complicated parameter setting of genetic-based EM (GAEM). We show by experiments that the proposed algorithm is 9-63% faster in computation time than repeated EM and 20-83% faster than split-and-merge EM, except in one case. RSEM is much faster than GAEM but reaches a lower log-likelihood on synthetic data with a certain parameter setting; otherwise the proposed algorithm reaches comparable results in terms of log-likelihood.
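
The abstract describes the general idea: run EM, then repeatedly perturb the converged solution by removing one component and adding a new one elsewhere, and continue EM from the perturbed solution. The sketch below illustrates that loop; it is not the authors' exact procedure. It assumes scikit-learn's GaussianMixture for the EM steps, relocates the swapped component to a randomly chosen data point with a broad covariance, and accepts a swap only if the log-likelihood improves; the function name random_swap_em and the reinitialization details are illustrative assumptions.

```python
import numpy as np
from sklearn.mixture import GaussianMixture


def random_swap_em(X, n_components, n_swaps=20, em_iters=100, seed=0):
    """Illustrative random-swap EM loop (sketch, not the paper's exact method)."""
    rng = np.random.default_rng(seed)
    n_features = X.shape[1]

    # Initial EM run from scikit-learn's default (k-means based) initialization.
    best = GaussianMixture(n_components=n_components, covariance_type="full",
                           max_iter=em_iters, random_state=seed).fit(X)

    for _ in range(n_swaps):
        means = best.means_.copy()
        covs = best.covariances_.copy()

        # Random swap: "remove" one component and "add" a new one located at a
        # randomly chosen data point, with the overall data covariance (assumption).
        j = rng.integers(n_components)
        means[j] = X[rng.integers(len(X))]
        covs[j] = np.cov(X, rowvar=False) + 1e-6 * np.eye(n_features)
        weights = np.full(n_components, 1.0 / n_components)  # re-balance weights

        # Continue EM from the perturbed solution.
        cand = GaussianMixture(n_components=n_components, covariance_type="full",
                               max_iter=em_iters, random_state=seed,
                               weights_init=weights, means_init=means,
                               precisions_init=np.linalg.inv(covs)).fit(X)

        # Keep the swap only if it improves the average log-likelihood.
        if cand.score(X) > best.score(X):
            best = cand
    return best
```

In this reading, the only tuning parameter beyond the mixture size is the number of swaps (n_swaps), which matches the abstract's claim that RSEM avoids the more complicated parameter setting of genetic-based EM.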