This paper presents a constraint-based multi-agent strategy for coreference resolution of general noun phrases in unrestricted English text. For a given anaphor, with all preceding referring expressions taken as antecedent candidates, a common constraint agent first filters out invalid candidates using various kinds of general knowledge. Then, according to the type of the anaphor, a special constraint agent filters out further invalid candidates using constraints derived from various kinds of special knowledge. Finally, a simple preference agent chooses an antecedent for the anaphor from the remaining candidates, based on the proximity principle. One interesting observation is that the most recent antecedent of an anaphor in a coreferential chain is sometimes linked to the anaphor only indirectly, via other antecedents in the chain. In such cases, we find that the most recent antecedent usually contains too little information to determine the coreference relationship with the anaphor directly. Therefore, for a given anaphor, the corresponding special constraint agent can always safely filter out these less informative candidates. In this way, rather than finding the most recent antecedent for an anaphor, our system tries to find the most direct and informative antecedent. Evaluation shows that our system achieves Precision / Recall / F-measure of 84.7% / 65.8% / 73.9 and 82.8% / 55.7% / 66.5 on the MUC-6 and MUC-7 English coreference tasks, respectively. This means that our system improves precision by about 8 percentage points over the best-reported systems while maintaining comparable recall.
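The three-stage pipeline the abstract describes (common constraint agent, type-specific special constraint agent, proximity-based preference agent) can be sketched as follows. This is a minimal illustrative sketch, not the authors' implementation: the `Candidate` class, the agreement and informativeness tests, and all function names are hypothetical stand-ins for the paper's knowledge sources.

```python
from dataclasses import dataclass

@dataclass
class Candidate:
    text: str
    position: int   # token offset in the text; larger = closer to the anaphor
    gender: str     # "m", "f", or "n"
    number: str     # "sg" or "pl"

def common_constraint_agent(anaphor, candidates):
    # General knowledge: here, simple gender and number agreement.
    return [c for c in candidates
            if c.gender == anaphor.gender and c.number == anaphor.number]

def special_constraint_agent(anaphor, candidates):
    # Type-specific knowledge; this stand-in drops less informative
    # candidates (e.g. bare pronouns) so the system prefers a more
    # direct, informative antecedent, as the abstract describes.
    informative = [c for c in candidates
                   if len(c.text.split()) > 1 or c.text.istitle()]
    return informative or candidates  # never filter down to nothing

def preference_agent(candidates):
    # Proximity principle: choose the closest remaining candidate.
    return max(candidates, key=lambda c: c.position) if candidates else None

def resolve(anaphor, candidates):
    remaining = common_constraint_agent(anaphor, candidates)
    remaining = special_constraint_agent(anaphor, remaining)
    return preference_agent(remaining)
```

With this sketch, a pronoun like "she" would skip over a nearer but uninformative pronoun candidate ("her") and resolve to the nearest informative mention ("Mary Jones"), mirroring the paper's preference for the most direct and informative antecedent over the most recent one.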