Stochastic complexity for mixture of exponential families in generalized variational Bayes

  • Authors:
  • Kazuho Watanabe; Sumio Watanabe

  • Affiliations:
  • Department of Computational Intelligence and Systems Science, Tokyo Institute of Technology, Japan; P&I Lab., Tokyo Institute of Technology, Japan

  • Venue:
  • Theoretical Computer Science
  • Year:
  • 2007

Abstract

Variational Bayesian learning, proposed as an approximation of Bayesian learning, has provided computational tractability and good generalization performance in many applications. However, little has been done to investigate its theoretical properties. In this paper, we discuss the Variational Bayesian learning of mixtures of exponential families and derive the asymptotic form of the stochastic complexities under a generalized setting of the prior distribution. We show that the stochastic complexities become smaller than those of regular statistical models, which implies that the advantage of Bayesian learning is retained in Variational Bayesian learning. Stochastic complexity, also known as the marginal likelihood or the free energy, is important not only for addressing the model selection problem but also for assessing the accuracy of the Variational Bayesian approach as an approximation of true Bayesian learning. The main result also shows the effects of the prior distribution under the generalized setting.
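For concreteness, the quantities named in the abstract can be written out with their standard definitions (a sketch in my own notation, not necessarily the paper's): the stochastic complexity is the negative log marginal likelihood, and the variational free energy obtained by Variational Bayesian learning upper-bounds it.

```latex
% Stochastic complexity (negative log marginal likelihood) of a model
% p(x \mid w) with prior \varphi(w), given data x^n = (x_1, \dots, x_n):
F(x^n) = -\log \int \prod_{i=1}^{n} p(x_i \mid w)\, \varphi(w)\, dw .

% The variational free energy, minimized over a tractable family of
% trial posteriors q(w), upper-bounds the stochastic complexity:
\overline{F}(x^n)
  = \min_{q} \int q(w)
    \log \frac{q(w)}{\prod_{i=1}^{n} p(x_i \mid w)\, \varphi(w)}\, dw
  \;\ge\; F(x^n),

% with equality if and only if q(w) equals the true Bayesian posterior;
% the gap is the KL divergence from q(w) to that posterior.
```

The asymptotic behavior of these quantities as $n \to \infty$ is what allows the paper to compare Variational Bayesian learning with both regular statistical models and true Bayesian learning.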