Video synchronization based on events alignment

  • Authors:
  • Yiguang Liu; Menglong Yang; Zhisheng You

  • Affiliations:
  • Yiguang Liu: Vision and Image processing Lab, Sichuan University, Chengdu 610064, PR China; School of Computer Science and Engineering, Sichuan University, Chengdu 610064, PR China
  • Menglong Yang: Vision and Image processing Lab, Sichuan University, Chengdu 610064, PR China; Key Laboratory of Fundamental Synthetic Vision Graphics and Image for National Defense, Sichuan University, Chengdu 610064, PR China
  • Zhisheng You: Key Laboratory of Fundamental Synthetic Vision Graphics and Image for National Defense, Sichuan University, Chengdu 610064, PR China; School of Computer Science and Engineering, Sichuan University, Chengdu 610064, PR China

  • Venue:
  • Pattern Recognition Letters
  • Year:
  • 2012

Abstract

This paper presents a method for synchronizing two video sequences. Changes in the kinematic status of feature points are treated as events. The basic idea is to temporally align the events observed by the two cameras: an algorithm scores each candidate event correspondence so that false correspondences, which receive lower scores, can be discarded. The recovered event correspondences are then used to coarsely estimate the synchronization parameters via the Hough transform. Finally, these parameters are refined by solving an optimization problem, recovering synchronization to sub-frame accuracy. The method is evaluated quantitatively on synthetic sequences and demonstrated qualitatively on several real sequences. Experimental results show that the method handles multiple features, a single feature, different frame rates, and even a single feature with relative motion between the two cameras.
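
The two-stage estimation described in the abstract (coarse Hough-transform voting followed by optimization-based refinement) can be illustrated with a short sketch. The Python code below is not the authors' implementation: it assumes a linear time model t2 = alpha * t1 + dt, represents events simply as arrays of event times (frame indices), and omits the correspondence-scoring stage that filters false matches. The function names coarse_align_hough and refine, and all parameter ranges, are hypothetical choices for illustration.

```python
import numpy as np
from scipy.optimize import minimize

def coarse_align_hough(events1, events2, alpha_range=(0.5, 2.0), dt_range=(-200.0, 200.0),
                       n_alpha=60, n_dt=401):
    """Coarse estimate of (alpha, dt) in the model t2 = alpha * t1 + dt.

    Every candidate event pair (t1, t2) votes, for each discretized alpha,
    into the dt bin it implies; the accumulator peak gives the coarse estimate.
    """
    alphas = np.linspace(alpha_range[0], alpha_range[1], n_alpha)
    dts = np.linspace(dt_range[0], dt_range[1], n_dt)
    step = dts[1] - dts[0]
    acc = np.zeros((n_alpha, n_dt))
    for t1 in events1:
        for t2 in events2:
            implied_dt = t2 - alphas * t1              # one implied dt per alpha
            valid = (implied_dt >= dts[0]) & (implied_dt <= dts[-1])
            bins = np.clip(np.round((implied_dt - dts[0]) / step).astype(int), 0, n_dt - 1)
            acc[np.arange(n_alpha)[valid], bins[valid]] += 1
    i, j = np.unravel_index(np.argmax(acc), acc.shape)
    return alphas[i], dts[j]

def refine(events1, events2, alpha0, dt0):
    """Refine (alpha, dt) by minimizing the distance from each mapped event of
    camera 1 to its nearest event in camera 2 (sub-frame accuracy)."""
    e1 = np.asarray(events1, dtype=float)
    e2 = np.asarray(events2, dtype=float)

    def cost(params):
        a, d = params
        mapped = a * e1 + d
        return np.sum(np.min(np.abs(mapped[:, None] - e2[None, :]), axis=1))

    res = minimize(cost, x0=[alpha0, dt0], method="Nelder-Mead")
    return res.x

if __name__ == "__main__":
    # Toy example: camera 2 observes the same events at 0.8x the frame rate,
    # shifted by 12.5 frames, plus one spurious detection.
    e1 = np.array([10.0, 40.0, 75.0, 120.0, 180.0])
    e2 = np.append(0.8 * e1 + 12.5, 55.0)
    alpha0, dt0 = coarse_align_hough(e1, e2)
    alpha, dt = refine(e1, e2, alpha0, dt0)
    print(alpha0, dt0, alpha, dt)
```

The nearest-neighbour cost in refine is only a stand-in for the paper's optimization step; its advantage for a sketch is that it tolerates the leftover spurious event without an explicit outlier-rejection pass.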