Probabilistic CFG with latent annotations

  • Authors:
  • Takuya Matsuzaki; Yusuke Miyao; Jun'ichi Tsujii

  • Affiliations:
  • University of Tokyo, Hongo, Bunkyo-ku, Tokyo; University of Tokyo, Hongo, Bunkyo-ku, Tokyo; University of Tokyo, Hongo, Bunkyo-ku, Tokyo and CREST, JST (Japan Science and Technology Agency), Honcho, Kawaguchi-shi, Saitama

  • Venue:
  • ACL '05: Proceedings of the 43rd Annual Meeting of the Association for Computational Linguistics
  • Year:
  • 2005

Abstract

This paper defines a generative probabilistic model of parse trees, called PCFG-LA. The model extends PCFG by augmenting non-terminal symbols with latent variables. Fine-grained CFG rules are automatically induced from a parsed corpus by training a PCFG-LA model with the EM algorithm. Because exact parsing with a PCFG-LA is NP-hard, several approximations are described and empirically compared. In experiments on the Penn WSJ corpus, the automatically trained model achieved a parsing accuracy of 86.6% F1 (sentences ≤ 40 words), comparable to that of an unlexicalized PCFG parser built with extensive manual feature selection.
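To illustrate the core idea of latent annotations, the following minimal sketch (not the authors' code) shows how each non-terminal X is split into latent subsymbols X[0], X[1], ... and how the probability of an ordinary (unannotated) parse tree is obtained by marginalizing over all latent annotation assignments with an inside-style recursion. The toy grammar, rule probabilities, and number of latent symbols are hypothetical and chosen only for illustration.

```python
from collections import defaultdict

# Number of latent subsymbols per non-terminal (hypothetical choice).
N_LATENT = 2

# Annotated binary-rule probabilities P( X[x] -> Y[y] Z[z] ),
# indexed as rule_prob[(X, Y, Z)][(x, y, z)].  Here uniform over
# child annotations, so they sum to 1 for each X[x].
rule_prob = {
    ("S", "NP", "VP"): {
        (x, y, z): 1.0 / (N_LATENT ** 2)
        for x in range(N_LATENT)
        for y in range(N_LATENT)
        for z in range(N_LATENT)
    },
}

# Annotated lexical probabilities P( X[x] -> word ); illustrative values.
lex_prob = {
    ("NP", "dogs"): {0: 0.6, 1: 0.4},
    ("VP", "bark"): {0: 0.5, 1: 0.5},
}

# Distribution over the latent annotation of the root symbol.
root_prob = {0: 0.5, 1: 0.5}


def inside(tree):
    """Return {root annotation: inside probability} for `tree`,
    summing over the latent annotations of all descendants.
    `tree` is (label, word) for a preterminal or (label, left, right)."""
    label = tree[0]
    if isinstance(tree[1], str):          # preterminal rewriting to a word
        return dict(lex_prob[(label, tree[1])])
    _, left, right = tree
    in_left, in_right = inside(left), inside(right)
    probs = defaultdict(float)
    for (x, y, z), p in rule_prob[(label, left[0], right[0])].items():
        probs[x] += p * in_left.get(y, 0.0) * in_right.get(z, 0.0)
    return probs


def tree_probability(tree):
    """P(unannotated tree) = sum_x P(root = X[x]) * inside(tree)[x]."""
    return sum(root_prob[x] * p for x, p in inside(tree).items())


if __name__ == "__main__":
    toy_tree = ("S", ("NP", "dogs"), ("VP", "bark"))
    print(tree_probability(toy_tree))
```

In the full model, the annotated rule and lexical probabilities above are the quantities estimated by EM from an ordinary treebank, and the same marginalization is what makes exact decoding of the best unannotated tree NP-hard, motivating the approximations compared in the paper.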