Compilers: principles, techniques, and tools
Prolog and natural-language analysis
Transition network grammars for natural language analysis
Communications of the ACM
EACL '85 Proceedings of the second conference on European chapter of the Association for Computational Linguistics
ACL '91 Proceedings of the 29th annual meeting on Association for Computational Linguistics
Practical experiments with regular approximation of context-free languages
Computational Linguistics - Special issue on finite-state methods in NLP
Finite-state approximation of constraint-based grammars using left-corner grammar transforms
COLING '98 Proceedings of the 17th international conference on Computational linguistics - Volume 1
Compact non-left-recursive grammars using the selective left-corner transform and factoring
COLING '00 Proceedings of the 18th conference on Computational linguistics - Volume 1
Incremental interpretation: applications, theory, and relationship to dynamic semantics
COLING '94 Proceedings of the 15th conference on Computational linguistics - Volume 2
Syntactic complexity measures for detecting mild cognitive impairment
BioNLP '07 Proceedings of the Workshop on BioNLP 2007: Biological, Translational, and Clinical Language Processing
Modeling sentence processing in ACT-R
IncrementParsing '04 Proceedings of the Workshop on Incremental Parsing: Bringing Engineering and Cognition Together
Heuristic search in a cognitive model of human parsing
IWPT '09 Proceedings of the 11th International Conference on Parsing Technologies
Broad-coverage parsing using human-like memory constraints
Computational Linguistics
HHMM parsing with limited parallelism
CMCL '10 Proceedings of the 2010 Workshop on Cognitive Modeling and Computational Linguistics
Incremental, predictive parsing with psycholinguistically motivated tree-adjoining grammar
Computational Linguistics
It is well known that even extremely limited center-embedding causes people difficulty in comprehension, whereas left- and right-branching constructions produce no such effect. If this difficulty is taken to be a result of processing load, as is widely assumed, then measuring the processing load that a parsing strategy incurs on these constructions may help determine its plausibility as a psychological model. On this basis it has been argued [AJ91, JL83] that, by identifying processing load with space utilization, we can rule out both top-down and bottom-up parsing as viable candidates for the human sentence processing mechanism, and that left-corner parsing represents a plausible alternative.

Examining their arguments in detail, we find difficulties with each presentation. In this paper we revise the argument and validate its central claim. In so doing, we discover that the key distinction between the parsing methods is not the form of prediction (top-down vs. bottom-up vs. left-corner), but rather the ability to instantiate the operation of composition.
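The space-utilization argument can be made concrete with a small simulation. The sketch below (not the paper's own formalization, just an illustrative model over binary trees, where a tuple is an internal node and a string is a word) measures maximum stack depth for three strategies: predictive top-down parsing, eager shift-reduce (bottom-up) parsing, and arc-eager left-corner parsing in which a newly projected node composes with the prediction it satisfies.

```python
# Stack-depth simulation contrasting top-down, bottom-up (shift-reduce),
# and arc-eager left-corner parsing on three tree shapes.
# Trees are binary: a tuple (left, right) is an internal node, a string a word.

def top_down_depth(tree):
    """Max stack depth of a predictive (top-down, depth-first) parse."""
    stack, maxd = [tree], 1
    while stack:
        node = stack.pop()
        if isinstance(node, tuple):          # expand a prediction
            stack.extend(reversed(node))     # left child ends up on top
            maxd = max(maxd, len(stack))
    return maxd

def bottom_up_depth(tree):
    """Max stack depth of a shift-reduce parse that reduces eagerly."""
    stack, maxd = [], 0
    def walk(node):
        nonlocal maxd
        if isinstance(node, str):
            stack.append(node)               # shift a word
        else:
            walk(node[0]); walk(node[1])
            stack.pop(); stack.pop()         # reduce the two children
            stack.append(node)
        maxd = max(maxd, len(stack))
    walk(tree)
    return maxd

def left_corner_depth(tree):
    """Max stack depth of an arc-eager left-corner parse with composition:
    a newly projected node replaces the prediction it satisfies."""
    stack, maxd = [], 0
    def recognize(node):
        nonlocal maxd
        if isinstance(node, tuple):
            recognize(node[0])               # recognize the left corner bottom-up
            if stack and stack[-1][1] is node:
                stack.pop()                  # composition: discharge the prediction
            stack.append(node)               # node now awaits its right child
            maxd = max(maxd, len(stack))
            recognize(node[1])
            if stack and stack[-1] is node:
                stack.pop()                  # right child attached; node complete
    recognize(tree)
    return maxd

def left_branching(n):
    t = 'x'
    for _ in range(n):
        t = (t, 'x')
    return t

def right_branching(n):
    t = 'x'
    for _ in range(n):
        t = ('x', t)
    return t

def center_embedded(n):
    t = 'x'
    for _ in range(n):
        t = ('a', (t, 'b'))
    return t
```

On this model, top-down stack depth grows linearly on left-branching trees, bottom-up depth grows linearly on right-branching trees, and only the composing left-corner parser stays constant on both; all three grow linearly under center-embedding, matching the pattern of human difficulty described above.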