OTESC: online transformation estimation between stereo cameras

  • Authors:
  • Tiemen Schreuder; Emile A. Hendriks; André Redert

  • Affiliations:
  • Delft University of Technology, Delft, Netherlands; Delft University of Technology, Delft, Netherlands; Philips Research Europe, Eindhoven, Netherlands

  • Venue:
  • Proceedings of the 1st international workshop on 3D video processing
  • Year:
  • 2010

Abstract

In this paper, we describe a system that performs online transformation estimation between pre-calibrated stereo cameras. This allows the stereo cameras to be moved around and automatically re-calibrated without the use of a calibration object, and it lets the set-up recover from accidental nudges that invalidate the extrinsic (external to the stereo camera) calibration. The obtained transformations can be used in virtual view rendering for 3D video. The relative positions and orientations of the stereo cameras are obtained from sparse point correspondences found in different views of the scene. For each stereo camera, 3D coordinates of salient scene points are triangulated, and their image feature descriptors are used to locate the same points in the views of other stereo cameras. The SIFT and SURF salient point descriptors are evaluated for this purpose. Given enough salient image points, the proposed solution accurately finds the transformation between stereo camera pairs with a reprojection error of less than 1 pixel.
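
The core step described in the abstract, estimating the relative pose between two stereo rigs from triangulated 3D points matched via their descriptors, can be illustrated with a standard SVD-based (Kabsch) least-squares fit. The sketch below is a minimal illustration under that assumption; the function names, the intrinsic matrix `K`, and the reprojection-error helper are hypothetical and not the paper's actual implementation.

```python
import numpy as np

def estimate_rigid_transform(pts_a, pts_b):
    """Estimate R, t such that R @ p_a + t ~= p_b for corresponding 3D points.

    pts_a, pts_b: (N, 3) arrays of the same scene points, triangulated by
    stereo rig A and stereo rig B respectively. Uses the SVD-based
    (Kabsch) least-squares fit over the matched point sets.
    """
    ca, cb = pts_a.mean(axis=0), pts_b.mean(axis=0)          # centroids
    H = (pts_a - ca).T @ (pts_b - cb)                         # 3x3 cross-covariance
    U, _, Vt = np.linalg.svd(H)
    # Guard against a reflection in the least-squares solution.
    D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])
    R = Vt.T @ D @ U.T
    t = cb - R @ ca
    return R, t

def mean_reprojection_error(pts_a, pixels_b, R, t, K):
    """Mean reprojection error (pixels) of rig A's points in rig B's reference view.

    K is the 3x3 intrinsic matrix of rig B's reference camera (assumed known
    from the offline per-rig calibration); pixels_b holds the observed (N, 2)
    image coordinates of the same salient points in that view.
    """
    cam = (R @ pts_a.T).T + t                    # points expressed in rig B's frame
    proj = (K @ cam.T).T
    proj = proj[:, :2] / proj[:, 2:3]            # perspective division
    return np.linalg.norm(proj - pixels_b, axis=1).mean()
```

In a full pipeline, the matched 3D points would come from SIFT or SURF correspondences between the rigs' views, and a robust wrapper (e.g. RANSAC over the fit above) would typically be used to reject descriptor mismatches before accepting the transformation.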