Three new probabilistic models for dependency parsing: an exploration
COLING '96 Proceedings of the 16th Conference on Computational Linguistics - Volume 1
Japanese dependency analysis using cascaded chunking
COLING-02 Proceedings of the 6th Conference on Natural Language Learning - Volume 20
Online large-margin training of dependency parsers
ACL '05 Proceedings of the 43rd Annual Meeting on Association for Computational Linguistics
Deterministic dependency parsing of English text
COLING '04 Proceedings of the 20th international conference on Computational Linguistics
Algorithms for deterministic incremental dependency parsing
Computational Linguistics
Japanese dependency parsing using a tournament model
COLING '08 Proceedings of the 22nd International Conference on Computational Linguistics - Volume 1
EMNLP '08 Proceedings of the Conference on Empirical Methods in Natural Language Processing
An efficient algorithm for easy-first non-directional dependency parsing
HLT '10 Human Language Technologies: The 2010 Annual Conference of the North American Chapter of the Association for Computational Linguistics
Head-driven transition-based parsing with top-down prediction
ACL '12 Proceedings of the 50th Annual Meeting of the Association for Computational Linguistics: Long Papers - Volume 1
Nivre's method was improved by enhancing deterministic dependency parsing through the application of a tree-based model. By representing the parser's state as partial trees rather than single words, the model takes into account all words relevant to selecting a parsing action. It chooses the most probable head candidate from among these trees and uses that candidate to select the parsing action. In an evaluation experiment on the Penn Treebank (WSJ section), the proposed model achieved higher accuracy than previous deterministic models. Although the proposed model's worst-case time complexity is O(n^2), the experimental results showed an average parsing time close to linear.
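The head-candidate selection described above can be sketched roughly as follows. This is a simplified illustration, not the paper's actual model: the `score` function is a toy stand-in for a trained classifier, and restricting candidates to the roots of partial trees is an assumption made here for brevity.

```python
# Illustrative sketch of deterministic left-to-right dependency parsing
# in which each incoming word is attached to the most probable head
# candidate among the partial trees built so far. The real model would
# use a trained classifier over tree features; `score` below is a toy
# heuristic (prefer the nearest head to the left) used only as a stand-in.

from dataclasses import dataclass, field


@dataclass
class Node:
    index: int
    word: str
    head: int = -1                              # -1 while unattached
    children: list = field(default_factory=list)


def score(head: Node, dep: Node) -> float:
    """Hypothetical scoring function: a trained model would estimate how
    likely `head` governs `dep`, drawing features from both partial trees.
    Toy version: prefer heads closer to the dependent."""
    return -abs(head.index - dep.index)


def parse(words):
    """Attach each word to the best-scoring head candidate among the
    roots of the partial trees seen so far (a simplification of the
    tree-based candidate set described in the abstract)."""
    nodes = [Node(i, w) for i, w in enumerate(words)]
    roots = []  # roots of partial trees, left to right
    for node in nodes:
        if roots:
            best = max(roots, key=lambda r: score(r, node))
            node.head = best.index
            best.children.append(node.index)
        roots.append(node)
    return [n.head for n in nodes]
```

With the toy distance-based scorer, `parse(["the", "dog", "barks"])` attaches each word to its immediate left neighbor, returning `[-1, 0, 1]`; a learned scorer would of course produce linguistically motivated attachments instead.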