Incremental, predictive parsing with psycholinguistically motivated tree-adjoining grammar

  • Authors: Vera Demberg; Frank Keller; Alexander Koller
  • Affiliations: Saarland University; University of Edinburgh; University of Potsdam
  • Venue: Computational Linguistics
  • Year: 2013

Abstract

Psycholinguistic research shows that key properties of the human sentence processor are incrementality, connectedness (partial structures contain no unattached nodes), and prediction (upcoming syntactic structure is anticipated). There is currently no broad-coverage parsing model with these properties, however. In this article, we present the first broad-coverage probabilistic parser for PLTAG, a variant of TAG that supports all three requirements. We train our parser on a TAG-transformed version of the Penn Treebank and show that it achieves performance comparable to existing TAG parsers that are incremental but not predictive. We also use our PLTAG model to predict human reading times, demonstrating a better fit on the Dundee eye-tracking corpus than a standard surprisal model.
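Surprisal, the baseline reading-time predictor mentioned in the abstract, is the negative log conditional probability of a word given its left context. A minimal sketch of the computation (the function name and the probability values are illustrative, not from the article or its parser):

```python
import math

def surprisal(cond_prob: float) -> float:
    """Surprisal in bits: -log2 P(w_i | w_1 .. w_{i-1}).

    Low-probability (unexpected) words yield high surprisal,
    which correlates with longer reading times.
    """
    return -math.log2(cond_prob)

# Hypothetical conditional probabilities for words in context,
# as an incremental model might assign them word by word.
example = {"the": 0.5, "cat": 0.01, "sat": 0.2}
for word, p in example.items():
    print(f"{word}: {surprisal(p):.2f} bits")
```

In the article's setting, the conditional probabilities would come from the incremental PLTAG parser's prefix probabilities rather than from a fixed table as above.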