News thread extraction based on topical n-gram model with a background distribution

  • Authors:
  • Zehua Yan; Fang Li

  • Affiliations:
  • Department of Computer Science and Engineering, Shanghai Jiao Tong University, China (both authors)

  • Venue:
  • ICONIP'11: Proceedings of the 18th International Conference on Neural Information Processing, Volume Part II
  • Year:
  • 2011

Abstract

Automatic thread extraction for news events can help people understand the different aspects of a news event. In this paper, we present an extraction method based on a topical N-gram model with a background distribution (TNB). Unlike most topic models, such as Latent Dirichlet Allocation (LDA), which rely on the bag-of-words assumption, our model treats words in their textual order. Each news report is represented as a combination of a background distribution over the corpus and a mixture distribution over hidden news threads. Thus our model can capture “presidential election”, which recurs across different years, as a background phrase, and “Obama wins” as a thread of the event “2008 USA presidential election”. We apply our method to two different corpora. Evaluation based on human judgment shows that the model generates meaningful and interpretable threads from a news corpus.
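The abstract only sketches the generative idea, so the following is a minimal toy sampler illustrating how a topical n-gram model with a background distribution might generate text. It is not the paper's actual model or notation; all names (lambda_bg, sigma, phi_thread, phi_bigram) and the specific mixing scheme are illustrative assumptions.

```python
"""Toy generative sketch of a TNB-style model: each word is drawn either from
a corpus-wide background distribution or from a hidden news thread, and
consecutive thread words may chain into phrases (the n-gram part).
All parameter names and values here are illustrative assumptions."""
import numpy as np

rng = np.random.default_rng(0)

vocab = ["presidential", "election", "obama", "wins", "mccain",
         "votes", "campaign", "debate"]
V = len(vocab)
K = 2                        # number of hidden news threads (assumed)

lambda_bg = 0.3              # assumed probability a word is a background word
sigma = 0.4                  # assumed probability of continuing a thread phrase
phi_bg = rng.dirichlet(np.ones(V))          # background word distribution
phi_thread = rng.dirichlet(np.ones(V), K)   # per-thread unigram distributions
phi_bigram = rng.dirichlet(np.ones(V), (K, V))  # P(w_i | w_{i-1}, thread)


def generate_report(n_words=12):
    """Generate one toy news report mixing background words with words
    drawn from a single hidden thread, occasionally chaining bigrams."""
    z = rng.integers(K)                     # thread assignment for this report
    words, prev = [], None
    for _ in range(n_words):
        if rng.random() < lambda_bg:
            w = rng.choice(V, p=phi_bg)               # background word
        elif prev is not None and rng.random() < sigma:
            w = rng.choice(V, p=phi_bigram[z, prev])  # continue a thread phrase
        else:
            w = rng.choice(V, p=phi_thread[z])        # new thread word
        words.append(vocab[w])
        prev = w
    return z, " ".join(words)


for _ in range(3):
    thread, text = generate_report()
    print(f"thread {thread}: {text}")
```

In this sketch, recurring terms such as “presidential election” would tend to receive high probability under the shared background distribution, while thread-specific phrases such as “obama wins” would concentrate in one thread's unigram and bigram distributions, mirroring the separation the abstract describes.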