We present an improved training strategy for dependency parsers that handle non-projective trees by online reordering. The new strategy improves both efficiency and accuracy, reducing the number of swap operations performed on non-projective trees by up to 80%. We report state-of-the-art results for five languages, including the best results ever reported for Czech.
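The swap operations mentioned in the abstract come from a transition system that extends arc-standard parsing with a SWAP action, which moves the second-topmost stack token back to the buffer and thereby reorders the input online so that otherwise crossing (non-projective) arcs can be built. As a minimal sketch of these mechanics (the example sentence, token indices, and transition sequence below are illustrative, not the paper's oracle):

```python
def parse(n, transitions):
    """Execute a swap-based transition sequence (arc-standard plus SWAP)
    over tokens 1..n with artificial root 0; return each token's head."""
    stack, buffer, heads = [0], list(range(1, n + 1)), {}
    for t in transitions:
        if t == "SHIFT":                      # push next buffer token
            stack.append(buffer.pop(0))
        elif t == "SWAP":                     # move second-top back to the
            buffer.insert(0, stack.pop(-2))   # buffer: online reordering
        elif t == "LEFT-ARC":                 # stack top heads second-top
            heads[stack.pop(-2)] = stack[-1]
        elif t == "RIGHT-ARC":                # second-top heads stack top
            dep = stack.pop()
            heads[dep] = stack[-1]
    return heads

# A 4-token sentence with the non-projective arc 4 -> 1 (token 2,
# whose head is the root 0, lies inside that arc's span).
seq = ["SHIFT", "SHIFT", "SWAP", "SHIFT", "SHIFT", "SWAP",
       "RIGHT-ARC", "SHIFT", "SHIFT", "LEFT-ARC", "RIGHT-ARC", "RIGHT-ARC"]
print(parse(4, seq))  # {3: 2, 1: 4, 4: 2, 2: 0}
```

Each SWAP costs an extra transition, so a training oracle that gets the same tree with fewer swaps (as the abstract claims, up to 80% fewer) yields shorter, cheaper transition sequences.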