EM Learning for Symbolic-Statistical Models in Statistical Abduction

  • Authors:
  • Taisuke Sato

  • Affiliations:
  • -

  • Venue:
  • Progress in Discovery Science, Final Report of the Japanese Discovery Science Project
  • Year:
  • 2002

Abstract

We first review a logical-statistical framework called statistical abduction and identify its three computational tasks, one of which is the learning of parameters from observations by ML (maximum likelihood) estimation. Traditionally, in the presence of missing values, the EM algorithm has been used for ML estimation. We report that the graphical EM algorithm, a new EM algorithm developed for statistical abduction, achieves the same time complexity as specialized EM algorithms developed in individual disciplines, such as the Inside-Outside algorithm for PCFGs (probabilistic context-free grammars). Furthermore, learning experiments using two corpora revealed that it can outperform the Inside-Outside algorithm by orders of magnitude. We then look specifically into a family of PCFG extensions that incorporate context sensitivity into PCFGs. Experiments show that they are learnable by the graphical EM algorithm in at most twice the time required for plain PCFGs, even though these extensions have higher time complexity.
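To illustrate the kind of ML estimation with missing values that the abstract refers to, the sketch below runs plain EM on a toy two-coin mixture, where the identity of the coin tossed in each session is the unobserved (missing) variable. This is an illustrative assumption-laden example only; it is not the graphical EM algorithm or the statistical abduction framework of the paper, and all names in it are hypothetical.

```python
# Minimal EM sketch for ML estimation with a hidden variable (which coin was tossed).
# Illustrative only; not the paper's graphical EM algorithm.
import numpy as np

def em_two_coins(head_counts, n_flips, theta=(0.6, 0.4), iters=50):
    """head_counts[i] = number of heads in the i-th session of n_flips tosses.
    The coin used in each session is unobserved (the 'missing value')."""
    theta_a, theta_b = theta
    heads = np.asarray(head_counts, dtype=float)
    tails = n_flips - heads
    for _ in range(iters):
        # E-step: posterior responsibility of coin A for each session
        like_a = theta_a**heads * (1.0 - theta_a)**tails
        like_b = theta_b**heads * (1.0 - theta_b)**tails
        resp_a = like_a / (like_a + like_b)
        # M-step: re-estimate head probabilities from expected counts
        theta_a = (resp_a * heads).sum() / (resp_a * n_flips).sum()
        theta_b = ((1.0 - resp_a) * heads).sum() / ((1.0 - resp_a) * n_flips).sum()
    return theta_a, theta_b

print(em_two_coins([5, 9, 8, 4, 7], n_flips=10))
```

The graphical EM algorithm of the paper applies the same E-step/M-step idea, but computes the expected counts over a compact graph of shared explanations rather than by enumerating hidden outcomes explicitly, which is what yields the reported complexity and speed results.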