Event driven summarization for web videos

  • Authors:
  • Richang Hong, Jinhui Tang, Hung-Khoon Tan, Shuicheng Yan, Chong-Wah Ngo, Tat-Seng Chua

  • Affiliations:
  • School of Computing, National University of Singapore, Singapore (Richang Hong, Jinhui Tang, Tat-Seng Chua); Dept. of ECE, National University of Singapore, Singapore (Shuicheng Yan); Dept. of CS, City University of Hong Kong, Hong Kong, China (Hung-Khoon Tan, Chong-Wah Ngo)

  • Venue:
  • WSM '09: Proceedings of the First SIGMM Workshop on Social Media
  • Year:
  • 2009

Abstract

The explosive growth of web videos raises the challenge of how to efficiently browse hundreds or even thousands of videos at a glance. Given an event-driven query, social media web sites can easily return a large but diverse and somewhat noisy ranked list of videos, and users often have to painstakingly explore the retrieved list to obtain an overview of the event. This paper presents a novel solution that mines and threads "key" shots to summarize a large set of diverse videos, providing an overview of their main contents at a glance. The proposed framework comprises three stages of multi-video summarization. First, given an event query, a ranked list of web videos together with their associated tags is retrieved. Key shots are then established by near-duplicate keyframe detection, ranked according to informativeness, and threaded in chronological order. Finally, summarization is formulated as an optimization procedure that trades off the relevance of key shots against a user-defined skimming ratio. The framework delivers the summary in the form of a dynamic video skim. We conducted user studies on twelve event queries over more than one hundred hours of videos crawled from YouTube. The evaluation demonstrates the feasibility and effectiveness of the proposed solution.
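
The final stage can be read as a budgeted selection problem: pick the key shots that maximize relevance while keeping the summary within the user-defined skimming ratio, then thread the chosen shots chronologically. The Python sketch below illustrates only that general idea, not the paper's actual formulation; the `KeyShot` structure, the relevance scores, and the greedy relevance-per-second heuristic are assumptions standing in for the optimization procedure described in the abstract.

```python
# Hypothetical sketch of the summarization stage: select key shots under a
# duration budget given by the skimming ratio, then order them chronologically.
# The greedy heuristic is an assumption, not the paper's optimization method.
from dataclasses import dataclass


@dataclass
class KeyShot:
    shot_id: str
    timestamp: float   # position on the event timeline, used for threading
    duration: float    # shot length in seconds
    relevance: float   # informativeness/relevance score from the ranking stage


def summarize(key_shots: list[KeyShot], skimming_ratio: float) -> list[KeyShot]:
    """Pick key shots whose total duration fits the skimming budget."""
    budget = skimming_ratio * sum(s.duration for s in key_shots)
    selected, used = [], 0.0
    # Greedy by relevance density (relevance per second) as a simple stand-in
    # for the relevance-vs-skimming-ratio trade-off.
    for shot in sorted(key_shots, key=lambda s: s.relevance / s.duration, reverse=True):
        if used + shot.duration <= budget:
            selected.append(shot)
            used += shot.duration
    # Thread the selected shots in chronological order for the final skim.
    return sorted(selected, key=lambda s: s.timestamp)


if __name__ == "__main__":
    shots = [
        KeyShot("a", 0.0, 12.0, 0.9),
        KeyShot("b", 30.0, 8.0, 0.4),
        KeyShot("c", 75.0, 15.0, 0.7),
    ]
    for s in summarize(shots, skimming_ratio=0.5):
        print(s.shot_id, s.timestamp, s.relevance)
```

In this toy run, shots "a" and "b" fit within half of the total duration and are returned in timeline order; a real system would replace the greedy step with the paper's optimization over key-shot relevance.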