In recent years, much research has been devoted to data-driven dependency parsing, and its accuracy has increased steadily. Dependency grammar has an important inherent property: nodes closer to the root usually contribute more information to readers than other nodes. Previous work, however, ignores this property and treats every node in a dependency structure as playing the same role. In this paper, a parser based on weighted hierarchical structure learning is proposed to perform shallow dependency parsing, giving preference to nodes closer to the root during learning. The experimental results show that accuracy on nodes closer to the root improves at the cost of a small decrease in accuracy on nodes farther from the root.
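The core idea of weighting nodes by their distance from the root can be sketched as follows. This is a minimal illustration, not the paper's actual method: the `1/depth` weighting scheme and the function names are assumptions chosen for clarity; the paper does not specify its exact weight function here.

```python
def depths(heads):
    """Compute the depth of each token in a dependency tree.

    heads[i] is the head index of token i+1 (tokens are 1-based);
    a head of 0 means the token is attached to the artificial root.
    """
    memo = {}

    def depth(i):
        if i == 0:          # artificial root sits at depth 0
            return 0
        if i not in memo:
            memo[i] = depth(heads[i - 1]) + 1
        return memo[i]

    return [depth(i) for i in range(1, len(heads) + 1)]


def depth_weights(heads):
    """Hypothetical weighting: weight = 1/depth, so nodes nearer the
    root contribute more to the training objective."""
    return [1.0 / d for d in depths(heads)]
```

For example, for the head sequence `[0, 1, 1, 3]` the token depths are `[1, 2, 2, 3]`, so the root's immediate dependent receives weight 1.0 while the deepest token receives weight 1/3, matching the stated preference for nodes closer to the root.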