Variational Bayes via propositionalized probability computation in PRISM

  • Authors:
  • Taisuke Sato; Yoshitaka Kameya; Kenichi Kurihara

  • Affiliations:
  • Tokyo Institute of Technology, Tokyo, Japan; Tokyo Institute of Technology, Tokyo, Japan; Google Japan Inc., Tokyo, Japan

  • Venue:
  • Annals of Mathematics and Artificial Intelligence
  • Year:
  • 2008

Abstract

We propose a logic-based approach to variational Bayes (VB) via propositionalized probability computation in the symbolic-statistical modeling language PRISM. PRISM computes probabilities of logical formulas by reducing them to AND/OR Boolean formulas, called explanation graphs, that contain probabilistic ${\tt msw/2}$ atoms. We place Dirichlet priors on the parameters of ${\tt msw/2}$ atoms and derive a variational Bayes EM algorithm that learns the hyperparameters of these priors from data. The algorithm runs on explanation graphs deduced from a program and a goal, and computes probabilities by dynamic programming in time linear in the size of the graphs. To demonstrate the genericity and effectiveness of Bayesian modeling with the proposed approach, we conducted two learning experiments, one with a probabilistic right-corner grammar and the other with a profile HMM. To our knowledge, VB has not previously been applied to these models.
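
To make the setting concrete, the sketch below shows the kind of PRISM program from which such explanation graphs are built: a small hidden Markov model written with ${\tt msw/2}$ switches. This is a hypothetical illustration, not code from the paper; the switch names (init, tr/1, out/1), the two states and the two-symbol alphabet are assumptions chosen for brevity.

    % Switch declarations: msw(init, S) samples an initial state,
    % msw(tr(S), S1) a state transition, and msw(out(S), O) an emission.
    values(init,   [s0, s1]).
    values(tr(_),  [s0, s1]).
    values(out(_), [a, b]).

    % hmm(Os): Os is a symbol sequence emitted by the hidden Markov chain.
    hmm(Os) :-
        msw(init, S),
        hmm(S, Os).

    hmm(_, []).
    hmm(S, [O|Os]) :-
        msw(out(S), O),      % emit a symbol in state S
        msw(tr(S), Next),    % move to the next state
        hmm(Next, Os).

For a goal such as hmm([a,b,a]), tabled search over these clauses would yield an explanation graph whose AND/OR structure shares common subexplanations. In the VB-EM scheme sketched in the abstract, the expected counts of each switch outcome would be collected on that graph by dynamic programming and added to the corresponding Dirichlet hyperparameters, while the E-step would use digamma-based quantities derived from the current hyperparameters in place of point-estimated switch probabilities, as in the standard VB-EM update for Dirichlet-multinomial models.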