A convenient multicamera self-calibration for virtual environments

  • Authors:
  • Tomáš Svoboda; Daniel Martinec; Tomáš Pajdla

  • Affiliations:
  • Faculty of Electrical Engineering, Czech Technical University, Prague, Czech Republic

  • Venue:
  • Presence: Teleoperators and Virtual Environments
  • Year:
  • 2005


Abstract

Virtual immersive environments or telepresence setups often consist of multiple cameras that have to be calibrated. We present a convenient method for doing this. At least three cameras are required, but there is no upper limit. The method is fully automatic, and a freely moving bright spot is the only calibration object. A set of virtual 3D points is generated by waving the bright spot through the working volume. Its projections are found with subpixel precision and verified by a robust RANSAC analysis. The cameras do not have to see all points; only a reasonable overlap between camera subgroups is necessary. Projective structures are computed via rank-4 factorization, and the Euclidean stratification is done by imposing geometric constraints. This linear estimate initializes a postprocessing step that estimates nonlinear distortion, which is also fully automatic. We suggest a simple trick for using an ordinary laser pointer as the calibration object. We show that it is possible to calibrate an immersive virtual environment with 16 cameras in less than 60 minutes, reaching a reprojection error of about 1/5 pixel. The method has been successfully tested on numerous multicamera environments using varying numbers of cameras of varying quality.
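
The rank-4 factorization step mentioned in the abstract can be illustrated with a minimal sketch. Assuming the measurement matrix W of projective-depth-scaled homogeneous image points has already been assembled (depths estimated, missing observations filled in), the factorization reduces to a truncated SVD. The NumPy code and the function name rank4_factorize below are illustrative assumptions, not the authors' implementation:

```python
import numpy as np

def rank4_factorize(W):
    """Factor a rescaled measurement matrix W (3m x n) into stacked
    projective cameras P_hat (3m x 4) and homogeneous points X_hat
    (4 x n) such that W ~ P_hat @ X_hat.  W holds the depth-scaled
    projections of n 3D points seen by m cameras; for noise-free
    data it has rank 4, so we keep the four dominant singular values."""
    U, s, Vt = np.linalg.svd(W, full_matrices=False)
    sqrt_s = np.sqrt(s[:4])
    P_hat = U[:, :4] * sqrt_s           # 3m x 4 stacked camera matrices
    X_hat = sqrt_s[:, None] * Vt[:4]    # 4 x n homogeneous point matrix
    return P_hat, X_hat
```

Note that P_hat and X_hat are recovered only up to an unknown 4x4 projective transformation; resolving that ambiguity is exactly what the subsequent Euclidean stratification with geometric constraints accomplishes.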