Synchronized ego-motion recovery of two face-to-face cameras

  • Authors:
  • Jinshi Cui (State Key Lab on Machine Perception, Peking University, China)
  • Yasushi Yagi (Department of Intelligent Media, Osaka University, Japan)
  • Hongbin Zha (State Key Lab on Machine Perception, Peking University, China)
  • Yasuhiro Mukaigawa (Department of Intelligent Media, Osaka University, Japan)
  • Kazuaki Kondo (Department of Intelligent Media, Osaka University, Japan)

  • Venue:
  • ACCV'07 Proceedings of the 8th Asian Conference on Computer Vision - Volume Part I
  • Year:
  • 2007


Abstract

A movie captured by a wearable camera affixed to an actor's body gives audiences a sense of being immersed in the movie. However, the raw footage from a wearable camera must be stabilized to remove jitter caused by ego-motion, and conventional approaches often fail to estimate ego-motion accurately when moving objects dominate the image and the background region provides too few feature pairs. To address this problem, we propose a new approach that exploits an additional synchronized video captured by a camera attached to the foreground object (another actor). Formally, we model this sensor system as two face-to-face moving cameras and derive the relations among four views, namely two consecutive views from each camera. The proposed solution has two steps: first, we calibrate the extrinsic relationship of the two cameras using an AX=XB formulation; second, we estimate the motion using the calibration matrix. Experiments verify that this approach can recover from failures of the conventional approach and provides acceptable stabilization results on real data.
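The AX=XB formulation mentioned in the calibration step is the classical hand-eye calibration problem: given paired relative motions A_i (one camera) and B_i (the other camera) linked by a fixed extrinsic transform X, solve A_i X = X B_i. The sketch below is not the authors' implementation; it is a standard least-squares solver in the style of Park and Martin, checked on synthetic motions. All function names and the test transforms are invented for illustration.

```python
import numpy as np
from scipy.linalg import logm, sqrtm, inv

def solve_ax_xb(As, Bs):
    """Least-squares solution of A_i X = X B_i on SE(3) (Park-Martin style).

    As, Bs: lists of 4x4 homogeneous relative-motion matrices, one pair
    per time step. Needs >= 3 motions with independent rotation axes.
    """
    # Rotation part: the rotation axes satisfy a_i = Rx b_i, so fit Rx
    # from the axis-angle vectors of each relative rotation.
    M = np.zeros((3, 3))
    for A, B in zip(As, Bs):
        a_skew = np.real(logm(A[:3, :3]))   # so(3) log of A's rotation
        b_skew = np.real(logm(B[:3, :3]))
        a = np.array([a_skew[2, 1], a_skew[0, 2], a_skew[1, 0]])
        b = np.array([b_skew[2, 1], b_skew[0, 2], b_skew[1, 0]])
        M += np.outer(b, a)
    Rx = np.real(sqrtm(inv(M.T @ M))) @ M.T   # closed-form best rotation
    # Translation part: stack (R_Ai - I) t_x = Rx t_Bi - t_Ai, solve by lstsq.
    C = np.vstack([A[:3, :3] - np.eye(3) for A in As])
    d = np.concatenate([Rx @ B[:3, 3] - A[:3, 3] for A, B in zip(As, Bs)])
    tx = np.linalg.lstsq(C, d, rcond=None)[0]
    X = np.eye(4)
    X[:3, :3], X[:3, 3] = Rx, tx
    return X

def transform(axis, angle, t):
    """Homogeneous transform from an axis-angle rotation and a translation."""
    axis = np.asarray(axis, float) / np.linalg.norm(axis)
    K = np.array([[0, -axis[2], axis[1]],
                  [axis[2], 0, -axis[0]],
                  [-axis[1], axis[0], 0]])
    T = np.eye(4)
    T[:3, :3] = np.eye(3) + np.sin(angle) * K + (1 - np.cos(angle)) * K @ K
    T[:3, 3] = t
    return T

# Synthetic check: pick a ground-truth extrinsic X_true (cameras roughly
# facing each other) and three motions B_i with independent rotation axes,
# then generate A_i = X B_i X^(-1) so that A_i X = X B_i holds exactly.
X_true = transform([0, 1, 0], np.pi * 0.95, [0.0, 0.0, 1.5])
Bs = [transform([1, 0, 0], 0.3, [0.1, 0.0, 0.05]),
      transform([0, 1, 0], 0.5, [0.0, 0.2, 0.0]),
      transform([0, 0, 1], 0.4, [0.05, 0.0, 0.1])]
As = [X_true @ B @ np.linalg.inv(X_true) for B in Bs]
X_est = solve_ax_xb(As, Bs)
```

With noise-free synthetic motions, `X_est` recovers `X_true` to numerical precision; in the paper's setting the A_i and B_i would instead come from per-camera ego-motion estimates, and the recovered X then lets one camera's motion constrain the other's when background features are scarce.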