Warping trajectories for video synchronization

  • Authors:
  • Sukrit Shankar; Joan Lasenby; Anil Kokaram

  • Affiliations:
  • Cambridge University, Cambridge, United Kingdom; Cambridge University, Cambridge, United Kingdom; Google, Inc., Mountain View, CA, USA & Trinity College Dublin, Ireland

  • Venue:
  • Proceedings of the 4th ACM/IEEE International Workshop on Analysis and Retrieval of Tracked Events and Motion in Imagery Stream
  • Year:
  • 2013


Abstract

Temporal synchronization of multiple video recordings of the same dynamic event is a critical task in many computer vision applications, e.g. novel view synthesis and 3D reconstruction. Typically this information is implicit, because the recordings are made against a common timebase or time-stamp information is embedded in the video streams. Recordings made with consumer-grade equipment contain no such information, so the signals must be synchronized temporally from the visual content itself. Previous work in this area has assumed either good-quality data with relatively simple dynamic content or the availability of precise camera geometry. In this paper, we propose a technique that exploits feature trajectories across views in a novel way and specifically targets the kind of complex content found in consumer-generated sports recordings, without assuming precise knowledge of fundamental matrices or homographies. Our method automatically selects the moving feature points in the two unsynchronized videos whose 2D trajectories can be best related, thereby helping to infer the synchronization index. We evaluate performance on a number of real recordings and show that synchronization can be achieved to within 1 second, which is better than previous approaches.
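To make the trajectory-based idea concrete, the sketch below is a minimal illustration, not the authors' algorithm: it reduces each 2D feature trajectory to a 1D motion signature (frame-to-frame speed), then searches for the temporal offset that maximizes normalized cross-correlation between the best-matching pair of signatures across the two videos. The helper names (speed_profile, best_offset, synchronize), the frame rate, and the search range are all assumptions made for this example.

    import numpy as np

    def speed_profile(traj):
        """1D motion signature of a 2D trajectory with shape (T, 2):
        per-frame speed, i.e. the norm of frame-to-frame displacement."""
        return np.linalg.norm(np.diff(traj, axis=0), axis=1)

    def best_offset(sig_a, sig_b, max_lag):
        """Offset (in frames) of sig_b relative to sig_a that maximizes
        normalized cross-correlation over the overlapping window."""
        best_lag, best_score = 0, -np.inf
        for lag in range(-max_lag, max_lag + 1):
            # Positive lag: video B starts `lag` frames after video A.
            a = sig_a[lag:] if lag >= 0 else sig_a
            b = sig_b if lag >= 0 else sig_b[-lag:]
            n = min(len(a), len(b))
            if n < 2:
                continue
            ac = a[:n] - a[:n].mean()
            bc = b[:n] - b[:n].mean()
            denom = np.linalg.norm(ac) * np.linalg.norm(bc)
            if denom == 0:
                continue
            score = np.dot(ac, bc) / denom
            if score > best_score:
                best_lag, best_score = lag, score
        return best_lag, best_score

    def synchronize(trajs_a, trajs_b, fps=30.0, max_lag=150):
        """Pick the trajectory pair (one per video) whose motion signatures
        correlate best, and return the implied offset in seconds."""
        best_lag, best_score = 0, -np.inf
        for ta in trajs_a:
            for tb in trajs_b:
                lag, score = best_offset(speed_profile(ta),
                                         speed_profile(tb), max_lag)
                if score > best_score:
                    best_lag, best_score = lag, score
        return best_lag / fps

    # Example: offset_sec = synchronize(tracks_video1, tracks_video2, fps=29.97)

A single speed signature discards spatial information, which is exactly why the paper's point about automatically selecting the feature points whose trajectories are best related matters; this sketch approximates that selection with a brute-force search over all trajectory pairs.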