Interactive search in large video collections

  • Authors:
  • Andreas Girgensohn; John Adcock; Matthew Cooper; Lynn Wilcox

  • Affiliations:
  • FX Palo Alto Laboratory, Palo Alto, CA (all authors)

  • Venue:
  • CHI '05 Extended Abstracts on Human Factors in Computing Systems
  • Year:
  • 2005


Abstract

We present a search interface for large video collections with time-aligned text transcripts. The system is designed for users such as intelligence analysts who need to quickly find video clips relevant to a topic expressed in text and images. A key component of the system is a powerful and flexible user interface that incorporates dynamic visualizations of the underlying multimedia objects. The interface displays search results in ranked sets of story keyframe collages and lets users explore the shots in a story. By adapting the keyframe collages based on query relevance and indicating which portions of the video have already been explored, we enable users to quickly find relevant sections. We tested our system as part of the NIST TRECVID interactive search evaluation and found that our user interface enabled users to find more relevant results within the allotted time than users of many systems employing more sophisticated analysis techniques.
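
To make the interface behavior concrete, the following is a minimal sketch, not taken from the paper, of the two UI ideas the abstract describes: sizing keyframes within a story collage in proportion to query relevance, and flagging shots the user has already explored. All names (`Shot`, `Story`, `build_collage_layout`) and the area-proportional layout rule are illustrative assumptions, not the authors' implementation.

```python
from dataclasses import dataclass, field

@dataclass
class Shot:
    shot_id: str
    keyframe_path: str
    relevance: float          # query relevance score from the search engine
    explored: bool = False    # has the user already viewed this shot?

@dataclass
class Story:
    story_id: str
    shots: list = field(default_factory=list)

    def score(self) -> float:
        # Rank stories by their best-matching shot (one plausible choice).
        return max((s.relevance for s in self.shots), default=0.0)

def build_collage_layout(story: Story, collage_area: float = 1.0, max_tiles: int = 4):
    """Pick the top shots of a story and size their keyframes by relevance."""
    top = sorted(story.shots, key=lambda s: s.relevance, reverse=True)[:max_tiles]
    total = sum(s.relevance for s in top) or 1.0
    return [
        {
            "shot_id": s.shot_id,
            "keyframe": s.keyframe_path,
            "area": collage_area * s.relevance / total,  # larger tile = more relevant
            "dimmed": s.explored,                        # visual cue: already seen
        }
        for s in top
    ]

# Usage: rank stories by score, then render one collage per story.
story = Story("story-7", [
    Shot("s1", "kf1.jpg", 0.9),
    Shot("s2", "kf2.jpg", 0.4, explored=True),
    Shot("s3", "kf3.jpg", 0.1),
])
for tile in build_collage_layout(story):
    print(tile)
```

The design intent, as the abstract states it, is that relevance-adaptive collages and explored-shot indicators let users triage stories quickly; the proportional-area rule and dimming above are just one simple way such cues could be realized.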