Notes on methods based on maximum-likelihood estimation for learning the parameters of the mixture of Gaussians model

  • Authors:
  • Luis E. Ortiz; Leslie Kaelbling


  • Year:
  • 1999


Abstract

In these notes, we present and review methods based on maximum-likelihood estimation for learning the parameters of the mixture-of-Gaussians model. We describe a method based directly on the likelihood equations, traditional gradient-based methods (among them steepest ascent and gradient ascent), expectation-maximization (EM), conjugate-gradient methods, and proposed accelerations of EM (among them Aitken acceleration, also known as parameterized EM, and conjugate-gradient acceleration of EM). All of the methods are described in the context of the mixture-of-Gaussians model and are related through a generalized gradient-based formulation. Since no single method is theoretically dominant, we empirically analyze their performance on synthetic datasets and suggest cases in which particular methods outperform others.
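To make the central method concrete, here is a minimal sketch of the EM updates for a one-dimensional Gaussian mixture. This is not the authors' implementation; the function name `em_gmm_1d`, the quantile-based initialization, and the synthetic two-cluster example are all illustrative choices. The E-step computes responsibilities, and the M-step applies the closed-form weighted maximum-likelihood updates for the mixing weights, means, and variances.

```python
import numpy as np

def em_gmm_1d(x, k=2, iters=100):
    """EM for a k-component 1-D Gaussian mixture (illustrative sketch)."""
    n = len(x)
    # Initialize: means at spread-out quantiles of the data, a shared
    # variance equal to the overall data variance, uniform mixing weights.
    mu = np.quantile(x, (np.arange(k) + 0.5) / k)
    var = np.full(k, np.var(x))
    pi = np.full(k, 1.0 / k)
    for _ in range(iters):
        # E-step: responsibilities r[i, j] proportional to
        # pi_j * N(x_i | mu_j, var_j).
        dens = (pi * np.exp(-0.5 * (x[:, None] - mu) ** 2 / var)
                / np.sqrt(2.0 * np.pi * var))
        r = dens / dens.sum(axis=1, keepdims=True)
        # M-step: closed-form weighted maximum-likelihood updates.
        nk = r.sum(axis=0)
        pi = nk / n
        mu = (r * x[:, None]).sum(axis=0) / nk
        var = (r * (x[:, None] - mu) ** 2).sum(axis=0) / nk
    return pi, mu, var

# Example: data drawn from two well-separated Gaussian clusters.
rng = np.random.default_rng(1)
x = np.concatenate([rng.normal(-5.0, 1.0, 500), rng.normal(5.0, 1.0, 500)])
pi, mu, var = em_gmm_1d(x, k=2)
```

In practice the E-step is usually computed in log space with a log-sum-exp to avoid underflow; the direct density form above is kept only for readability.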