Frame-level temporal calibration of video sequences from unsynchronized cameras

  • Authors:
  • Senem Velipasalar; Wayne H. Wolf

  • Affiliations:
  • University of Nebraska-Lincoln, Department of Electrical Engineering, 209N WSEC, 68588, Lincoln, NE, USA; Georgia Institute of Technology, Van Leer Electrical Engineering Building, 777 Atlantic Drive NW, 30332-0250, Atlanta, GA, USA

  • Venue:
  • Machine Vision and Applications
  • Year:
  • 2008

Abstract

This paper describes a method for temporally calibrating video sequences from unsynchronized cameras using image processing operations, and presents two search algorithms for matching and aligning trajectories across different camera views. Existing multi-camera systems assume that input video sequences are synchronized either by genlock or by time-stamp information and a centralized server. However, hardware-based synchronization increases installation cost, so image information must be used to align frames from cameras whose clocks are not synchronized. The system built for temporal calibration consists of three modules: an object tracking module, a calibration data extraction module, and a search module. A robust and efficient search algorithm is introduced that recovers the frame offset by matching trajectories across different views and finding the most reliable match. Because it draws on information from multiple trajectories, the algorithm is robust to possible errors in background subtraction and location extraction, and it can handle very large frame offsets. A RANdom SAmple Consensus (RANSAC)-based version of this search algorithm is also introduced. Results obtained with different video sequences are presented, demonstrating the robustness of the algorithms in recovering a wide range of frame offsets for video sequences with varying levels of object activity.
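
To make the idea concrete, the sketch below shows one simple way such an offset search could be organized: candidate frame offsets are scored by the distance between corresponding trajectories, and the offset giving the lowest-distance, most reliable match is kept; a RANSAC-style variant re-estimates the offset from random subsets of trajectory pairs and retains the best-supported hypothesis. This is only a minimal illustration, not the authors' exact algorithm: the function names, the assumption that trajectories are already associated across views and expressed in a common coordinate frame, and the inlier distance threshold are all illustrative choices.

```python
import numpy as np

def trajectory_distance(traj_a, traj_b, offset):
    """Mean point-wise distance between two trajectories after shifting
    traj_b forward by `offset` frames. Each trajectory is an iterable of
    (frame_index, x, y) triples; only overlapping frames are compared."""
    points_a = {int(f): (x, y) for f, x, y in traj_a}
    points_b = {int(f) + offset: (x, y) for f, x, y in traj_b}
    common = sorted(points_a.keys() & points_b.keys())
    if len(common) < 5:  # require a minimum overlap before trusting the score
        return np.inf
    pa = np.array([points_a[f] for f in common], dtype=float)
    pb = np.array([points_b[f] for f in common], dtype=float)
    return float(np.mean(np.linalg.norm(pa - pb, axis=1)))

def recover_frame_offset(pairs, max_offset=500):
    """Score every candidate offset over all trajectory pairs and return the
    offset with the lowest aggregate (median) distance. `pairs` is a list of
    (traj_view1, traj_view2) tuples whose coordinates are assumed to be
    mapped into a common reference view already."""
    best_offset, best_score = 0, np.inf
    for offset in range(-max_offset, max_offset + 1):
        scores = [trajectory_distance(a, b, offset) for a, b in pairs]
        scores = [s for s in scores if np.isfinite(s)]
        if not scores:
            continue
        score = float(np.median(scores))  # median damps the effect of bad tracks
        if score < best_score:
            best_offset, best_score = offset, score
    return best_offset, best_score

def recover_frame_offset_ransac(pairs, max_offset=500, iters=50,
                                inlier_dist=10.0, rng=None):
    """RANSAC-flavored variant: estimate the offset from small random subsets
    of trajectory pairs and keep the hypothesis supported by the most pairs
    (those whose distance falls under an inlier threshold, assumed here)."""
    if rng is None:
        rng = np.random.default_rng(0)
    best_offset, best_support = 0, -1
    for _ in range(iters):
        idx = rng.choice(len(pairs), size=min(2, len(pairs)), replace=False)
        offset, _ = recover_frame_offset([pairs[i] for i in idx], max_offset)
        dists = [trajectory_distance(a, b, offset) for a, b in pairs]
        support = sum(d < inlier_dist for d in dists)
        if support > best_support:
            best_offset, best_support = offset, support
    return best_offset
```

Aggregating scores over multiple trajectory pairs, rather than relying on a single track, is what gives this style of search its tolerance to occasional background-subtraction or location-extraction errors, in line with the robustness the abstract claims for the paper's method.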