Although joint inference is an effective way to avoid the error cascades that arise when inferring multiple natural language tasks in sequence, its application to information extraction has been limited to modeling only two tasks at a time, leading to modest improvements. In this paper, we focus on the three crucial tasks of automated extraction pipelines: entity tagging, relation extraction, and coreference. We propose a single, joint graphical model that represents the various dependencies between the tasks, allowing uncertainty to flow across task boundaries. Since the resulting model has high tree-width and contains a large number of variables, we present a novel extension to belief propagation that sparsifies the domains of variables during inference. Experimental results show that our joint model consistently improves results on all three tasks as we represent more dependencies. In particular, our joint model obtains a 12% error reduction on tagging over the isolated models.
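The core algorithmic idea in the abstract, sparsifying variable domains during belief propagation, can be illustrated with a minimal sketch. The code below is a hypothetical toy (two discrete variables, one pairwise factor, random potentials; all names and the pruning threshold are illustrative assumptions, not taken from the paper): it runs one round of sum-product messages restricted to each variable's active domain, then drops domain values whose belief falls below a threshold, shrinking the state space for subsequent rounds.

```python
# Toy sketch of belief propagation with domain sparsification.
# Hypothetical illustration; names and the threshold are assumptions.
import numpy as np

def normalize(v):
    return v / v.sum()

rng = np.random.default_rng(0)
# Two variables with 4-value domains, a unary potential each,
# and one pairwise factor coupling them.
unary = [normalize(rng.random(4)), normalize(rng.random(4))]
pairwise = rng.random((4, 4))
domains = [np.arange(4), np.arange(4)]  # currently active domain values

def bp_round(domains):
    """One round of sum-product messages over the active domains only."""
    d0, d1 = domains
    sub = pairwise[np.ix_(d0, d1)]          # factor restricted to active values
    m01 = sub.T @ unary[0][d0]              # message var0 -> var1
    m10 = sub @ unary[1][d1]                # message var1 -> var0
    b0 = normalize(unary[0][d0] * m10)      # belief over var0's active domain
    b1 = normalize(unary[1][d1] * m01)      # belief over var1's active domain
    return b0, b1

# Sparsify: prune domain values with belief below a threshold, so later
# inference rounds operate over a smaller (cheaper) state space.
THRESHOLD = 0.05
b0, b1 = bp_round(domains)
domains = [d[b >= THRESHOLD] for d, b in zip(domains, (b0, b1))]
```

Because beliefs over each active domain sum to one, at least one value always survives pruning for any threshold below 1/|domain|, so the sparsified model stays well-defined.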