U-likelihood and U-updating algorithms: statistical inference in latent variable models

  • Authors:
  • Jaemo Sung; Sung-Yang Bang; Seungjin Choi; Zoubin Ghahramani

  • Affiliations:
  • Department of Computer Science, POSTECH, Republic of Korea (Sung, Bang, Choi); Gatsby Computational Neuroscience Unit, University College London, London, England (Ghahramani)

  • Venue:
  • ECML'05: Proceedings of the 16th European Conference on Machine Learning
  • Year:
  • 2005

Abstract

In this paper we consider latent variable models and introduce a new $\mathcal{U}$-likelihood concept for estimating the distribution over hidden variables. One can derive an estimate of the parameters from this distribution. Our approach differs from the Bayesian and maximum likelihood (ML) approaches: it offers an alternative to Bayesian inference when we do not want to define a prior over parameters, and an alternative to the ML method when we want a better estimate of the distribution over hidden variables. As a practical implementation, we present a $\mathcal{U}$-updating algorithm, based on mean field theory, to approximate the distribution over hidden variables from the $\mathcal{U}$-likelihood. This algorithm captures some of the correlations among hidden variables by estimating reaction terms, which are found to penalize the likelihood. We show that the $\mathcal{U}$-updating algorithm reduces to the EM algorithm as a special case in the large sample limit. The useful behavior of our method is confirmed for the case of a mixture of Gaussians by comparing it to the EM algorithm.
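
The abstract's concrete test case compares the $\mathcal{U}$-updating algorithm to EM on a mixture of Gaussians. As a point of reference only, below is a minimal NumPy sketch of standard EM for a one-dimensional Gaussian mixture, i.e. the baseline the paper compares against. The function name `em_gmm_1d` and all implementation details are illustrative assumptions, not code from the paper; the $\mathcal{U}$-updating algorithm would modify the E-step with mean-field reaction terms whose exact form is not given in this abstract.

```python
# Minimal EM for a 1-D mixture of Gaussians -- the EM baseline mentioned in
# the abstract. This sketch is NOT the paper's U-updating algorithm; that
# method additionally estimates mean-field "reaction terms" that penalize
# the likelihood in the E-step.
import numpy as np

def em_gmm_1d(x, k, n_iter=100, seed=0):
    rng = np.random.default_rng(seed)
    n = x.shape[0]
    # Initialize mixing weights, means, and variances.
    pi = np.full(k, 1.0 / k)
    mu = rng.choice(x, size=k, replace=False)
    var = np.full(k, x.var())
    for _ in range(n_iter):
        # E-step: responsibilities r[n, k] proportional to pi_k * N(x_n | mu_k, var_k).
        log_p = (np.log(pi)
                 - 0.5 * np.log(2 * np.pi * var)
                 - 0.5 * (x[:, None] - mu) ** 2 / var)
        log_p -= log_p.max(axis=1, keepdims=True)   # numerical stability
        r = np.exp(log_p)
        r /= r.sum(axis=1, keepdims=True)
        # M-step: re-estimate parameters from the responsibilities.
        nk = r.sum(axis=0)
        pi = nk / n
        mu = (r * x[:, None]).sum(axis=0) / nk
        var = (r * (x[:, None] - mu) ** 2).sum(axis=0) / nk
    return pi, mu, var

# Toy usage: two well-separated Gaussian clusters.
rng = np.random.default_rng(1)
data = np.concatenate([rng.normal(-2.0, 0.5, 200), rng.normal(3.0, 1.0, 300)])
print(em_gmm_1d(data, k=2))
```

In this standard EM update the responsibilities factorize over data points given the current parameter estimate; the abstract's point is that the $\mathcal{U}$-based approach instead works with a distribution over hidden variables that retains some of their correlations, coinciding with EM only in the large sample limit.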