A novel disambiguation method for unification-based grammars using probabilistic context-free approximations

  • Authors:
  • Bernd Kiefer; Hans-Ulrich Krieger; Detlef Prescher

  • Affiliations:
  • Language Technology Lab, Saarbrücken, Germany (all authors)

  • Venue:
  • COLING '02: Proceedings of the 19th International Conference on Computational Linguistics, Volume 1
  • Year:
  • 2002

Abstract

We present a novel disambiguation method for unification-based grammars (UBGs). In contrast to other methods, our approach obviates the need for probability models on the UBG side by shifting the responsibility to simpler context-free models, indirectly obtained from the UBG. Our approach has three advantages: (i) training can be carried out effectively in practice, (ii) parsing and disambiguation of context-free readings requires only cubic time, and (iii) the probability distributions involved are mathematically clean. In an experiment with a mid-size UBG, we show that the approach is feasible. Using unsupervised training, we achieve 88% accuracy on an exact-match task.
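
To make the cubic-time disambiguation step concrete, the sketch below shows Viterbi (max-probability) CKY parsing with a toy PCFG in Chomsky normal form. This is a minimal illustration of the standard technique behind point (ii), not the authors' actual system: the grammar, rule probabilities, and example sentence are purely hypothetical. In the paper's setting, the context-free rules and their probabilities would instead come from the context-free approximation of the UBG and from unsupervised training.

```python
# Minimal sketch (not the authors' system): Viterbi CKY parsing with a toy PCFG
# in Chomsky normal form. Grammar and probabilities here are hypothetical; in the
# paper's setting they would be derived from the context-free approximation of
# the UBG. Runtime is O(n^3 * |G|), i.e. cubic in sentence length.

from collections import defaultdict

# Hypothetical CNF rules: (lhs, (rhs1, rhs2)) -> probability
binary_rules = {
    ("S",  ("NP", "VP")): 1.0,
    ("VP", ("V",  "NP")): 0.7,
    ("VP", ("VP", "PP")): 0.3,
    ("NP", ("NP", "PP")): 0.4,
    ("PP", ("P",  "NP")): 1.0,
}
# Hypothetical lexical rules: (lhs, word) -> probability
lexical_rules = {
    ("NP", "she"):        0.3,
    ("NP", "telescopes"): 0.3,
    ("V",  "sees"):       1.0,
    ("P",  "with"):       1.0,
}

def viterbi_cky(words):
    """Return the most probable parse tree (nested tuples) and its probability."""
    n = len(words)
    # best[(i, j)][A] = (probability, backpointer) for the best A spanning words[i:j]
    best = defaultdict(dict)

    # Fill in lexical (width-1) spans.
    for i, w in enumerate(words):
        for (a, word), p in lexical_rules.items():
            if word == w:
                best[(i, i + 1)][a] = (p, w)

    # Fill in wider spans bottom-up; the split-point loop gives the cubic behaviour.
    for span in range(2, n + 1):
        for i in range(0, n - span + 1):
            j = i + span
            for k in range(i + 1, j):
                for (a, (b, c)), p in binary_rules.items():
                    if b in best[(i, k)] and c in best[(k, j)]:
                        prob = p * best[(i, k)][b][0] * best[(k, j)][c][0]
                        if prob > best[(i, j)].get(a, (0.0, None))[0]:
                            best[(i, j)][a] = (prob, (k, b, c))

    def build(i, j, a):
        prob, back = best[(i, j)][a]
        if isinstance(back, str):          # lexical entry
            return (a, back)
        k, b, c = back
        return (a, build(i, k, b), build(k, j, c))

    if "S" not in best[(0, n)]:
        return None, 0.0
    return build(0, n, "S"), best[(0, n)]["S"][0]

if __name__ == "__main__":
    tree, prob = viterbi_cky("she sees telescopes".split())
    print(prob, tree)
```

The loop over split points nested inside the loop over spans is what yields the cubic time mentioned in the abstract; the grammar size enters only as a constant factor per cell.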