In this paper we investigate the Presentational Relative Clause (PRC) construction. In both the linguistic and NLP literature, relative clauses have been treated as carrying background information that is not directly relevant or highly useful for semantic analysis. In text summarization in particular, the information contained in relative clauses is often removed, on the view that it is peripheral to the topic or discourse. We discuss the importance of distinguishing the PRC construction from other relative clause types and show that in a PRC it is the relative clause, rather than the main clause, that carries the assertion of the utterance. Based on linguistic analysis, we propose informative features that may be used for the automatic extraction of PRC constructions. We believe that identifying this construction will be useful in discriminating central information from peripheral information.