Connectionist-inspired incremental PCFG parsing

  • Authors: Marten van Schijndel, Andy Exley, William Schuler
  • Affiliations: The Ohio State University; University of Minnesota; The Ohio State University
  • Venue: CMCL '12, Proceedings of the 3rd Workshop on Cognitive Modeling and Computational Linguistics
  • Year: 2012


Abstract

Probabilistic context-free grammars (PCFGs) are a popular cognitive model of syntax (Jurafsky, 1996). These can be formulated to be sensitive to human working memory constraints by application of a right-corner transform (Schuler, 2009). One side-effect of the transform is that it guarantees at most a single expansion (push) and at most a single reduction (pop) at each step of a syntactic parse. The primary finding of this paper is that this property of right-corner parsing can be exploited to obtain a dramatic reduction in the number of random variables in a probabilistic sequence model parser. This yields a simpler structure that more closely resembles existing simple recurrent network models of sentence comprehension.
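The bounded-operation property can be illustrated with a toy sketch (not the paper's actual sequence model): if each incoming word triggers at most one reduction followed by at most one expansion, the parser store's depth changes by at most one per time step. The category labels and the operation sequence below are hypothetical, chosen only to exercise the invariant.

```python
def step(stack, pop, push_sym):
    """One parser time step: an optional pop, then an optional push.

    Toy illustration of the right-corner bound: at most a single
    reduction (pop) and a single expansion (push) per word.
    """
    stack = list(stack)
    if pop and stack:
        stack.pop()               # at most a single reduction
    if push_sym is not None:
        stack.append(push_sym)    # at most a single expansion
    return stack

# A hypothetical four-word incremental derivation, written as
# (pop?, symbol-to-push-or-None) operations per word:
ops = [(False, "S/NP"), (False, "NP/N"), (True, "S/VP"), (True, None)]

stack, depths = [], []
for pop, push in ops:
    stack = step(stack, pop, push)
    depths.append(len(stack))

# Store depth never changes by more than one between time steps.
assert all(abs(b - a) <= 1 for a, b in zip([0] + depths, depths))
```

Because depth can change by at most one per step, a sequence-model formulation needs far fewer random variables to track the store than a parser allowing unbounded pushes or pops per word, which is the reduction the abstract describes.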