A cross-media evolutionary timeline generation framework based on iterative recommendation

  • Authors:
  • Shize Xu; Liang Kong; Yan Zhang

  • Affiliations:
  • Department of Machine Intelligence, Peking University, Beijing, China (all authors)

  • Venue:
  • Proceedings of the 3rd ACM conference on International conference on multimedia retrieval
  • Year:
  • 2013

Abstract

Summarization methods such as timelines have greatly helped people understand all kinds of news events in limited time. However, few studies have probed into cross-media summarization, for example, generating timelines that contain both text and images that reinforce each other. In this paper, we tackle this important and challenging problem by proposing a novel solution. Specifically, we first identify three requisite characteristics of an ideal image-text timeline. Using the idea of recommendation, each of these requisites is modeled separately and then fused compactly into a unified cross-media framework. Finally, we place every sentence and image into either the role of referrer or the role of recommended candidate, where the former recommends the latter. By swapping these roles iteratively, we arrive at optimal timelines that significantly improve user experience and satisfaction. Experiments on real-world datasets show that the timelines generated by our framework outperform several competitive baselines.
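The iterative referrer/candidate role-swapping described in the abstract can be sketched as follows. This is a minimal illustration only: the similarity function, the toy sentence/image items, and the fixed-round stopping rule are all assumptions for demonstration, not the authors' actual model.

```python
# Illustrative sketch of iterative cross-media recommendation:
# one side (referrers) votes for items on the other side (candidates),
# then the selected items swap roles and recommend back.

def recommend(referrers, candidates, similarity, top_k=1):
    """Score each candidate by its total similarity to the referrers
    and return the top_k highest-scoring candidates."""
    scores = {c: sum(similarity(r, c) for r in referrers) for c in candidates}
    return sorted(candidates, key=lambda c: scores[c], reverse=True)[:top_k]

def iterate_timeline(sentences, images, similarity, rounds=3, top_k=1):
    """Alternate roles for a fixed number of rounds: the currently
    selected sentences recommend images, then those images recommend
    sentences.  (The paper iterates toward optimal timelines; a fixed
    round count is an assumption here.)"""
    sel_sents = list(sentences)  # start with every sentence as referrer
    sel_imgs = []
    for _ in range(rounds):
        sel_imgs = recommend(sel_sents, images, similarity, top_k)
        sel_sents = recommend(sel_imgs, sentences, similarity, top_k)
    return sel_sents, sel_imgs

# Toy similarity: number of shared words between two text descriptions.
word_overlap = lambda a, b: len(set(a.split()) & set(b.split()))

sentences = ["fire downtown night", "weather sunny park"]
images = ["photo fire downtown", "photo park trees"]
print(iterate_timeline(sentences, images, word_overlap, rounds=2))
# The mutually reinforcing sentence-image pair is selected.
```

In this toy run, the fire-related sentence and image reinforce each other through the word overlap score, so the loop converges to that pair, mirroring the abstract's idea that text and images on the timeline should mutually recommend one another.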