Simplifying Mixture Models Using the Unscented Transform

  • Authors:
  • Jacob Goldberger; Hayit K. Greenspan; Jeremie Dreyfuss

  • Affiliations:
  • Bar-Ilan University, Ramat-Gan; Tel-Aviv University, Tel-Aviv; Tel-Aviv University, Tel-Aviv

  • Venue:
  • IEEE Transactions on Pattern Analysis and Machine Intelligence
  • Year:
  • 2008

Abstract

The Mixture of Gaussians (MoG) model is a useful tool in statistical learning. In many learning processes based on mixture models, the computational requirements become very demanding due to the large number of components involved in the model. We propose a novel algorithm for learning a simplified representation of a Gaussian mixture, based on the Unscented Transform, which was originally introduced for filtering nonlinear dynamical systems. The superiority of the proposed method is validated in both simulation experiments and the categorization of a real image database. The proposed categorization methodology is based on modeling each image using a Gaussian mixture model. A category model is obtained by learning a simplified mixture model from all the images in the category.
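
Below is a minimal sketch of the general idea described in the abstract, not necessarily the authors' exact algorithm: each component of the original mixture is replaced by its unscented-transform sigma points, and a smaller mixture is then fitted to the resulting weighted point set with a short weighted EM loop. All function names, parameter values (e.g. `kappa`), and the choice of weighted EM for the reduced fit are illustrative assumptions.

```python
"""Sketch: simplify a Gaussian mixture via unscented-transform sigma points."""
import numpy as np

def sigma_points(mean, cov, kappa=1.0):
    """Standard 2d+1 unscented-transform sigma points and weights for one Gaussian."""
    d = mean.shape[0]
    L = np.linalg.cholesky((d + kappa) * cov)      # columns span the scaled covariance
    pts = [mean] + [mean + L[:, j] for j in range(d)] + [mean - L[:, j] for j in range(d)]
    w = np.full(2 * d + 1, 1.0 / (2.0 * (d + kappa)))
    w[0] = kappa / (d + kappa)
    return np.array(pts), w

def simplify_mog(weights, means, covs, k, n_iter=50, seed=0):
    """Fit a k-component mixture to the weighted sigma points of a larger MoG."""
    rng = np.random.default_rng(seed)
    X, W = [], []
    for pi, mu, S in zip(weights, means, covs):
        pts, w = sigma_points(mu, S)
        X.append(pts)
        W.append(pi * w)                           # mixture weight times UT weight
    X, W = np.vstack(X), np.concatenate(W)
    W /= W.sum()
    d = X.shape[1]

    # Initialise the reduced mixture from randomly chosen sigma points.
    mus = X[rng.choice(len(X), size=k, replace=False)].copy()
    Sigmas = np.array([np.cov(X.T) + 1e-6 * np.eye(d)] * k)
    pis = np.full(k, 1.0 / k)

    for _ in range(n_iter):
        # E-step: responsibilities of the reduced components for each weighted point.
        logp = np.empty((len(X), k))
        for c in range(k):
            diff = X - mus[c]
            L = np.linalg.cholesky(Sigmas[c])
            sol = np.linalg.solve(L, diff.T)
            logp[:, c] = (np.log(pis[c]) - 0.5 * np.sum(sol ** 2, axis=0)
                          - np.sum(np.log(np.diag(L))) - 0.5 * d * np.log(2 * np.pi))
        logp -= logp.max(axis=1, keepdims=True)
        resp = np.exp(logp)
        resp /= resp.sum(axis=1, keepdims=True)

        # M-step: weighted parameter updates using the sigma-point weights.
        rw = resp * W[:, None]
        Nk = rw.sum(axis=0)
        pis = Nk / Nk.sum()
        mus = (rw.T @ X) / Nk[:, None]
        for c in range(k):
            diff = X - mus[c]
            Sigmas[c] = (rw[:, c][:, None] * diff).T @ diff / Nk[c] + 1e-6 * np.eye(d)
    return pis, mus, Sigmas
```

As a usage example, calling `simplify_mog(weights, means, covs, k=3)` on a mixture with many components returns the weights, means, and covariances of a 3-component approximation; in the categorization setting described above, the input would be the pooled per-image mixtures of a category, and the output the simplified category model.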