A deterministic word dependency analyzer enhanced with preference learning

  • Authors:
  • Hideki Isozaki, Hideto Kazawa, Tsutomu Hirao

  • Affiliations:
  • NTT Corporation, Hikaridai, Seikacho, Sourakugun, Kyoto, Japan (all authors)

  • Venue:
  • COLING '04: Proceedings of the 20th International Conference on Computational Linguistics
  • Year:
  • 2004


Abstract

Word dependency is important in parsing technology. Some applications, such as Information Extraction from biological documents, benefit from word dependency analysis even without phrase labels. Therefore, an accurate dependency analyzer that can be trained without phrase labels would be useful. Although such an English word dependency analyzer was proposed by Yamada and Matsumoto, its accuracy is lower than that of state-of-the-art phrase structure parsers because it lacks the top-down information given by phrase labels. This paper shows that the dependency analyzer can be improved by introducing a Root-Node Finder and a Prepositional-Phrase Attachment Resolver. Experimental results show that these modules, based on Preference Learning, give better scores than Collins' Model 3 parser on these subproblems. We expect that this method is also applicable to phrase structure parsers.
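The abstract does not describe the modules in detail. As a rough illustration of how a preference-learning Root-Node Finder might operate, the sketch below trains a pairwise ranker so that the true root of each training sentence outranks every other word, then picks the highest-scoring word at prediction time. The perceptron-style update, the feature set, and all names are assumptions made for this example; they are not the authors' actual implementation, which uses its own preference-learning formulation and features.

```python
# Illustrative sketch only (not the paper's implementation): a pairwise
# preference ranker that scores each word of a sentence as a candidate
# root and is trained so the true root outranks every other word.

from collections import defaultdict
from typing import List, Tuple

def features(sentence: List[Tuple[str, str]], i: int) -> List[str]:
    """Toy feature set for candidate root word i: word form, POS tag,
    and a coarse position bucket. Purely for illustration."""
    word, pos = sentence[i]
    return [f"w={word}", f"p={pos}", f"pos_bucket={min(i, 5)}"]

class RootNodeRanker:
    """Perceptron-style pairwise preference learner: each non-root word
    generates a preference 'root > word', and weights are updated
    whenever that preference is violated."""

    def __init__(self):
        self.w = defaultdict(float)

    def score(self, feats: List[str]) -> float:
        return sum(self.w[f] for f in feats)

    def train(self, data, epochs: int = 5):
        # data: list of (sentence, root_index) pairs,
        # where a sentence is a list of (word, POS) tuples.
        for _ in range(epochs):
            for sentence, root in data:
                root_feats = features(sentence, root)
                for j in range(len(sentence)):
                    if j == root:
                        continue
                    other_feats = features(sentence, j)
                    # Violated preference: the true root should score higher.
                    if self.score(root_feats) <= self.score(other_feats):
                        for f in root_feats:
                            self.w[f] += 1.0
                        for f in other_feats:
                            self.w[f] -= 1.0

    def predict_root(self, sentence) -> int:
        return max(range(len(sentence)),
                   key=lambda i: self.score(features(sentence, i)))

if __name__ == "__main__":
    # Tiny toy example: the verb is the root in both training sentences.
    train = [([("John", "NNP"), ("runs", "VBZ"), ("fast", "RB")], 1),
             ([("Mary", "NNP"), ("sleeps", "VBZ")], 1)]
    ranker = RootNodeRanker()
    ranker.train(train)
    print(ranker.predict_root([("Bob", "NNP"), ("eats", "VBZ"), ("apples", "NNS")]))
```

The same pairwise-ranking idea carries over to the Prepositional-Phrase Attachment Resolver: candidate attachment sites would be ranked against the correct one, with features drawn from the preposition and the candidate head words.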