Combining flat and structured approaches for temporal slot filling or: how much to compress?

  • Authors:
  • Qi Li, Javier Artiles, Taylor Cassidy, Heng Ji

  • Affiliations:
  • Computer Science Department and Linguistics Department, Queens College and Graduate Center, City University of New York, New York, NY (all authors)

  • Venue:
  • CICLing'12: Proceedings of the 13th International Conference on Computational Linguistics and Intelligent Text Processing, Part II
  • Year:
  • 2012

Abstract

In this paper, we present a hybrid approach to the Temporal Slot Filling (TSF) task. Our method decomposes the task into two steps: temporal classification and temporal aggregation. As in many other NLP tasks, a key challenge lies in capturing relations between text elements separated by a long context. We have observed that features derived from a structured text representation can help compress the context and reduce ambiguity. On the other hand, surface lexical features are more robust and work better in some cases. Experiments on the KBP2011 temporal training data set show that both the surface and structured approaches outperform a baseline bag-of-words classifier, and that the proposed hybrid method further improves performance significantly. Our system achieved the top performance in the KBP2011 evaluation.
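
To make the two-step decomposition concrete, below is a minimal, hypothetical sketch of how mention-level temporal classification with combined flat and structured features could feed a simple aggregation step. The feature names, the illustrative label set ("start", "end"), the min/max aggregation heuristic, and the use of scikit-learn are assumptions for illustration only, not the authors' implementation.

```python
# Hypothetical sketch of the two-step decomposition described in the abstract:
# (1) temporal classification of individual (entity, slot value, date) mentions,
# (2) aggregation of mention-level predictions into a single temporal interval.
# Feature names, label set, and the use of scikit-learn are illustrative
# assumptions, not the paper's actual system.

from sklearn.feature_extraction import DictVectorizer
from sklearn.linear_model import LogisticRegression


def flat_features(tokens, entity_idx, date_idx):
    """Surface lexical features: words between the entity and the date."""
    lo, hi = sorted((entity_idx, date_idx))
    feats = {f"bow={w.lower()}": 1.0 for w in tokens[lo + 1:hi]}
    feats["token_distance"] = float(hi - lo)
    return feats


def structured_features(dep_path):
    """Structured features: the dependency path between the entity mention
    and the temporal expression, which compresses a long surface context."""
    return {f"dep_path={'->'.join(dep_path)}": 1.0,
            "dep_path_length": float(len(dep_path))}


def combine(tokens, entity_idx, date_idx, dep_path):
    """Hybrid representation: union of flat and structured feature maps."""
    feats = flat_features(tokens, entity_idx, date_idx)
    feats.update(structured_features(dep_path))
    return feats


# Step 1: temporal classification -- label each mention with a coarse
# temporal relation (toy training data, illustrative labels).
train_X = [
    combine("Smith joined Acme in 1998 .".split(), 0, 4, ["nsubj", "joined", "obl"]),
    combine("Smith left Acme in 2004 .".split(), 0, 4, ["nsubj", "left", "obl"]),
]
train_y = ["start", "end"]

vec = DictVectorizer()
clf = LogisticRegression(max_iter=1000).fit(vec.fit_transform(train_X), train_y)


# Step 2: temporal aggregation -- merge mention-level labels into a
# KBP-style 4-tuple <T1, T2, T3, T4>, here with a simple min/max heuristic.
def aggregate(labeled_dates):
    starts = [d for lbl, d in labeled_dates if lbl == "start"]
    ends = [d for lbl, d in labeled_dates if lbl == "end"]
    return (min(starts) if starts else None, max(starts) if starts else None,
            min(ends) if ends else None, max(ends) if ends else None)


print(aggregate([("start", 1998), ("end", 2004)]))  # (1998, 1998, 2004, 2004)
```

In this sketch, the hybrid aspect amounts to simply taking the union of the flat and structured feature maps before classification; how much of the surface context to "compress" into structured features is exactly the trade-off the paper's title alludes to.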