A spatio-temporal pyramid matching for video retrieval

  • Authors:
  • Jaesik Choi;Ziyu Wang;Sang-Chul Lee;Won J. Jeon

  • Affiliations:
  • University of Illinois at Urbana-Champaign, 201 N. Goodwin Avenue, Urbana, IL 61801, USA;Inha University, 1103 High-tech Center, Yonghyun-dong 253, Nam-gu, Incheon, Republic of Korea;Inha University, 1103 High-tech Center, Yonghyun-dong 253, Nam-gu, Incheon, Republic of Korea;Samsung Research America - Silicon Valley, 75 West Plumeria Drive, San Jose, CA 95134, USA

  • Venue:
  • Computer Vision and Image Understanding
  • Year:
  • 2013


Abstract

An efficient video retrieval system is essential for finding relevant content in a large and typically heterogeneous collection of video clips. In this paper, we introduce a content-based video matching system that finds the most relevant video segments in a video database for a given query video clip. Finding relevant video clips is not a trivial task, because objects in a video clip can move constantly over time. To perform this task efficiently, we propose a novel video matching method called Spatio-Temporal Pyramid Matching (STPM). Considering the features of objects in 2D space and time, STPM recursively divides a video clip into a 3D spatio-temporal pyramid and compares the features at different resolutions. To improve retrieval performance, we consider both static and dynamic features of objects. We also provide a sufficient condition under which matching gains additional benefit from temporal information. The experimental results show that STPM performs better than other video matching methods.
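
The abstract only sketches the approach; as a rough illustration, below is a minimal Python sketch of a generic pyramid-match kernel extended to 3D (x, y, t) bins, in the spirit of STPM. The function names (`st_histogram`, `stpm_kernel`), vocabulary size, number of levels, and level weights are illustrative assumptions (the weights follow the standard spatial pyramid matching scheme of Lazebnik et al.), not the authors' implementation.

```python
import numpy as np

def st_histogram(codewords, coords, level, vocab_size):
    """Histogram of visual codewords over a 2^level x 2^level x 2^level
    grid of (x, y, t) cells; coords are normalized to [0, 1)."""
    cells = 2 ** level
    cell_idx = np.floor(coords * cells).astype(int)            # (N, 3) cell index per feature
    flat = (cell_idx[:, 0] * cells + cell_idx[:, 1]) * cells + cell_idx[:, 2]
    hist = np.zeros((cells ** 3, vocab_size))
    np.add.at(hist, (flat, codewords), 1.0)                    # count codewords per cell
    return hist

def stpm_kernel(cw_a, pos_a, cw_b, pos_b, levels=2, vocab_size=200):
    """Pyramid-match score between two clips: weighted sum of histogram
    intersections, with finer levels weighted more heavily."""
    score = 0.0
    for l in range(levels + 1):
        ha = st_histogram(cw_a, pos_a, l, vocab_size)
        hb = st_histogram(cw_b, pos_b, l, vocab_size)
        intersection = np.minimum(ha, hb).sum()
        weight = 1.0 / 2 ** levels if l == 0 else 1.0 / 2 ** (levels - l + 1)
        score += weight * intersection
    return score

# Toy usage with random quantized features (hypothetical data):
rng = np.random.default_rng(0)
cw_q, pos_q = rng.integers(0, 200, 500), rng.random((500, 3))   # query clip
cw_d, pos_d = rng.integers(0, 200, 500), rng.random((500, 3))   # database clip
print(stpm_kernel(cw_q, pos_q, cw_d, pos_d))
```

In this sketch, each local feature carries a quantized codeword and a normalized (x, y, t) position; matching a query against a database clip reduces to computing this kernel for each candidate and ranking by score.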