Free Energy of Stochastic Context Free Grammar on Variational Bayes

  • Authors:
  • Tikara Hosino; Kazuho Watanabe; Sumio Watanabe

  • Affiliations:
  • Computational Intelligence and System Science, Tokyo Institute of Technology, Yokohama, Japan; Computational Intelligence and System Science, Tokyo Institute of Technology, Yokohama, Japan; Precision and Intelligence Laboratory, Tokyo Institute of Technology, Yokohama, Japan

  • Venue:
  • ICONIP'06: Proceedings of the 13th International Conference on Neural Information Processing - Volume Part I
  • Year:
  • 2006

Abstract

Variational Bayesian learning was proposed as an approximation method for Bayesian learning. In spite of its efficiency and good experimental performance, its mathematical properties have not yet been clarified. In this paper we analyze a variational Bayesian Stochastic Context-Free Grammar that includes the true distribution, so that the model is non-identifiable. We derive its asymptotic free energy. It is shown that, under some conditions on the prior, the free energy is much smaller than that of identifiable models and that redundant non-terminals are eliminated.
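
For context, a minimal sketch of the standard variational Bayes quantities behind the result (generic notation assumed here, not taken from the paper's own derivation): for observed data $X^n = (X_1, \dots, X_n)$, hidden variables $Y^n$ (for an SCFG, the unobserved derivation trees), parameters $w$, model $p(x, y \mid w)$ and prior $\varphi(w)$, the Bayesian free energy and its variational upper bound are

$$
F(X^n) \;=\; -\log \int \prod_{i=1}^{n} p(X_i \mid w)\, \varphi(w)\, dw ,
$$

$$
\overline{F}(X^n) \;=\; \min_{q(Y^n)\, q(w)} \; \mathbb{E}_{q}\!\left[ \log \frac{q(Y^n)\, q(w)}{p(X^n, Y^n \mid w)\, \varphi(w)} \right] \;\ge\; F(X^n),
$$

where the minimum is taken over factorized distributions $q(Y^n, w) = q(Y^n)\, q(w)$. The paper's result concerns the asymptotic behavior of $\overline{F}(X^n)$ as $n \to \infty$ for a non-identifiable SCFG under the stated prior conditions.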