A gradient-based algorithm competitive with variational Bayesian EM for mixture of Gaussians

  • Authors:
  • Mikael Kuusela, Tapani Raiko, Antti Honkela, Juha Karhunen

  • Affiliations:
  • Adaptive Informatics Research Center, Helsinki University of Technology (TKK), Helsinki, Finland (all authors)

  • Venue:
  • IJCNN'09: Proceedings of the 2009 International Joint Conference on Neural Networks
  • Year:
  • 2009


Abstract

While variational Bayesian (VB) inference is typically done with the so-called VB EM algorithm, there are models where it cannot be applied because either the E-step or the M-step cannot be solved analytically. In 2007, Honkela et al. introduced a recipe for a gradient-based algorithm for VB inference that does not have such a restriction. In this paper, we derive the algorithm in the case of the mixture of Gaussians model. For the first time, the algorithm is experimentally compared to VB EM and its variant, using both artificial and real data. We conclude that the two algorithms are approximately equally fast, with the better choice depending on the problem.
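As a rough illustration of the two optimization styles being contrasted (this is not the paper's VB algorithm; maximum likelihood stands in for the variational objective, and all names below are for illustration only): the responsibilities computed in EM's E-step also appear in the gradient of the mixture log-likelihood, so the same quantities can drive either a closed-form M-step or a plain gradient-ascent step. A minimal NumPy sketch for a two-component 1-D Gaussian mixture:

```python
import numpy as np

rng = np.random.default_rng(0)
# Toy data: two well-separated Gaussian clusters
x = np.concatenate([rng.normal(-2.0, 1.0, 200), rng.normal(3.0, 1.0, 200)])

def densities(x, w, mu, sigma):
    # Weighted per-component Gaussian densities, shape (2, n)
    return np.stack([
        w[k] * np.exp(-0.5 * ((x - mu[k]) / sigma[k]) ** 2)
        / (sigma[k] * np.sqrt(2.0 * np.pi))
        for k in range(2)
    ])

def log_lik(x, w, mu, sigma):
    # Mixture log-likelihood (the objective both methods increase)
    return np.log(densities(x, w, mu, sigma).sum(axis=0)).sum()

def em_step(x, w, mu, sigma):
    # E-step: responsibilities r_kn; M-step: closed-form parameter updates
    dens = densities(x, w, mu, sigma)
    r = dens / dens.sum(axis=0)
    nk = r.sum(axis=1)
    w = nk / x.size
    mu = (r * x).sum(axis=1) / nk
    sigma = np.sqrt((r * (x - mu[:, None]) ** 2).sum(axis=1) / nk)
    return w, mu, sigma

def grad_step(x, w, mu, sigma, lr=1e-3):
    # Gradient ascent on the means only (simplest gradient-based variant):
    # d log L / d mu_k = sum_n r_kn (x_n - mu_k) / sigma_k^2
    dens = densities(x, w, mu, sigma)
    r = dens / dens.sum(axis=0)
    grad_mu = (r * (x - mu[:, None])).sum(axis=1) / sigma ** 2
    return w, mu + lr * grad_mu, sigma

init = (np.array([0.5, 0.5]), np.array([-1.0, 1.0]), np.array([1.0, 1.0]))
ll0 = log_lik(x, *init)

w, mu, sigma = init
for _ in range(30):
    w, mu, sigma = em_step(x, w, mu, sigma)
ll_em = log_lik(x, w, mu, sigma)

w, mu, sigma = init
for _ in range(200):
    w, mu, sigma = grad_step(x, w, mu, sigma)
ll_grad = log_lik(x, w, mu, sigma)
```

Both loops increase the same objective from the same initialization; which one gets there in less wall-clock time depends on the problem, which mirrors the paper's conclusion for the VB setting.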