Learning to Parse Natural Language with Maximum Entropy Models
Machine Learning - Special issue on natural language learning
Building a large annotated corpus of English: the Penn Treebank
Computational Linguistics - Special issue on using large corpora: II
A maximum-entropy-inspired parser
NAACL 2000 Proceedings of the 1st North American Chapter of the Association for Computational Linguistics Conference
Shallow parsing with conditional random fields
NAACL '03 Proceedings of the 2003 Conference of the North American Chapter of the Association for Computational Linguistics on Human Language Technology - Volume 1
EMNLP '02 Proceedings of the ACL-02 conference on Empirical methods in natural language processing - Volume 10
Coarse-to-fine n-best parsing and MaxEnt discriminative reranking
ACL '05 Proceedings of the 43rd Annual Meeting of the Association for Computational Linguistics
Speeding up full syntactic parsing by leveraging partial parsing decisions
COLING-ACL '06 Proceedings of the COLING/ACL 2006 Main Conference Poster Sessions
Linear complexity context-free parsing pipelines via chart constraints
NAACL '09 Proceedings of Human Language Technologies: The 2009 Annual Conference of the North American Chapter of the Association for Computational Linguistics
Cube pruning as heuristic search
EMNLP '09 Proceedings of the 2009 Conference on Empirical Methods in Natural Language Processing - Volume 1
Learning translation boundaries for phrase-based decoding
HLT '10 Human Language Technologies: The 2010 Annual Conference of the North American Chapter of the Association for Computational Linguistics
A fast decoder for joint word segmentation and POS-tagging using a single discriminative model
EMNLP '10 Proceedings of the 2010 Conference on Empirical Methods in Natural Language Processing
Fast and accurate arc filtering for dependency parsing
COLING '10 Proceedings of the 23rd International Conference on Computational Linguistics
Beam-width prediction for efficient context-free parsing
HLT '11 Proceedings of the 49th Annual Meeting of the Association for Computational Linguistics: Human Language Technologies - Volume 1
Joint training of dependency parsing filters through latent support vector machines
HLT '11 Proceedings of the 49th Annual Meeting of the Association for Computational Linguistics: Human Language Technologies: short papers - Volume 2
Unary constraints for efficient context-free parsing
HLT '11 Proceedings of the 49th Annual Meeting of the Association for Computational Linguistics: Human Language Technologies: short papers - Volume 2
Vine pruning for efficient multi-pass dependency parsing
NAACL HLT '12 Proceedings of the 2012 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies
MSR SPLAT, a language analysis toolkit
NAACL HLT '12 Proceedings of the 2012 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies: Demonstration Session
Generalized higher-order dependency parsing with cube pruning
EMNLP-CoNLL '12 Proceedings of the 2012 Joint Conference on Empirical Methods in Natural Language Processing and Computational Natural Language Learning
Finite-state chart constraints for reduced complexity context-free parsing pipelines
Computational Linguistics
In this paper, we consider classifying word positions by whether they can start or end a multi-word constituent. These classifications provide a mechanism for "closing" chart cells during context-free inference, which we show improves both efficiency and accuracy when used to constrain the well-known Charniak parser. Additionally, we present a method for closing a sufficient number of chart cells to guarantee quadratic worst-case complexity of context-free inference. Empirical results show that this O(n²) bound can be achieved without degrading parsing accuracy.
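To illustrate the idea, the following is a minimal sketch of how chart cells might be closed before inference, given per-word classifier decisions about whether a word can begin or end a multi-word constituent. The function name, the boolean-list representation of the classifier outputs, and the example sentence are illustrative assumptions, not the paper's actual implementation.

```python
# Sketch of chart-cell "closing": a cell (i, j) spans words i..j-1.
# Single-word cells always remain open; a multi-word cell stays open
# only if position i may START a multi-word constituent and position
# j-1 may END one. All names here are hypothetical.

def open_cells(n, can_start, can_end):
    """Return the set of chart cells (i, j) left open for inference."""
    cells = set()
    for i in range(n):
        for j in range(i + 1, n + 1):
            if j - i == 1:
                # Width-1 cells are never closed: every word must
                # receive at least a preterminal analysis.
                cells.add((i, j))
            elif can_start[i] and can_end[j - 1]:
                cells.add((i, j))
    return cells

# Hypothetical classifier decisions for a 5-word sentence.
starts = [True, False, True, True, False]  # may begin a multi-word span
ends   = [False, True, False, True, True]  # may end a multi-word span

cells = open_cells(5, starts, ends)
```

A CKY-style parser would then skip any (i, j) combination absent from `cells`; closing enough cells per word (e.g. allowing only O(1) open multi-word cells per start position) is what yields the quadratic worst-case bound discussed in the abstract.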