Modeling latent-dynamic in shallow parsing: a latent conditional model with improved inference

  • Authors:
  • Xu Sun; Louis-Philippe Morency; Daisuke Okanohara; Jun'ichi Tsujii

  • Affiliations:
  • The University of Tokyo, Hongo, Tokyo, Japan; USC Institute for Creative Technologies, Marina del Rey; The University of Tokyo, Hongo, Tokyo, Japan; The University of Tokyo, Hongo, Tokyo, Japan and The University of Manchester, Manchester, UK

  • Venue:
  • COLING '08 Proceedings of the 22nd International Conference on Computational Linguistics - Volume 1
  • Year:
  • 2008

Abstract

Shallow parsing is one of many NLP tasks that can be reduced to a sequence labeling problem. In this paper we show that latent-dynamics (i.e., the hidden substructure of shallow phrases) constitute a problem in shallow parsing, and that modeling this intermediate structure is useful. By analyzing the automatically learned hidden states, we show how the latent conditional model explicitly learns latent-dynamics. We propose the Best Label Path (BLP) inference algorithm, which produces the most probable label sequence in latent conditional models and outperforms two existing inference algorithms. With BLP inference, the latent-dynamic conditional random field (LDCRF) model significantly outperforms CRF models using word features, and achieves performance comparable to the most successful shallow parsers on the CoNLL data when part-of-speech features are added.
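
To make the latent-conditional setup concrete, the following minimal Python sketch illustrates the objective that best-label-path inference targets. This is not the paper's BLP algorithm; the label set, hidden-state sets, potentials, and sequence length are illustrative assumptions. In this setting each label owns a disjoint set of hidden states, the score of a label sequence is the sum over all hidden-state paths consistent with it, and the "best label path" is the label sequence maximizing that marginal. Brute-force enumeration is used here only because the toy sequence is short; the paper's BLP algorithm is an inference procedure for this objective on realistic inputs.

```python
# Illustrative sketch (not the paper's BLP algorithm): in a latent
# conditional model, each label owns a disjoint set of hidden states.
# The score of a *label* sequence is the sum over all hidden-state paths
# consistent with it; the best label sequence maximizes that marginal.
# All potentials below are made-up assumptions for demonstration only.
import itertools
import numpy as np

LABELS = ["B", "I"]                         # hypothetical chunk labels
HIDDEN = {"B": [0, 1], "I": [2, 3]}         # disjoint hidden-state sets per label
N_STATES = 4

rng = np.random.default_rng(0)
T = 3                                       # toy sequence length
emit = rng.random((T, N_STATES))            # assumed per-position state potentials
trans = rng.random((N_STATES, N_STATES))    # assumed state-transition potentials

def hidden_path_score(path):
    """Unnormalized potential of one hidden-state path."""
    score = emit[0, path[0]]
    for t in range(1, T):
        score *= trans[path[t - 1], path[t]] * emit[t, path[t]]
    return score

def label_sequence_score(labels):
    """Marginal score of a label sequence: sum over consistent hidden paths."""
    state_sets = [HIDDEN[y] for y in labels]
    return sum(hidden_path_score(p) for p in itertools.product(*state_sets))

# Brute-force "best label path": enumerate all label sequences (tiny T only).
best = max(itertools.product(LABELS, repeat=T), key=label_sequence_score)
print("best label sequence:", best)
```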