Complete Calibration of a Multi-camera Network

  • Authors:
  • Patrick Baker; Yiannis Aloimonos

  • Venue:
  • OMNIVIS '00 Proceedings of the IEEE Workshop on Omnidirectional Vision
  • Year:
  • 2000

Abstract

We describe a calibration procedure for a multi-camera rig. Consider a large number of synchronized cameras arranged in some space, for example, on the walls of a room looking inwards. It is not necessary for all the cameras to have a common field of view, as long as every camera is connected to every other camera through common fields of view. Switching off the lights and waving a wand with an LED at the end of it, we can capture a very large set of point correspondences (corresponding points are captured at the same time stamp). The correspondences are then used in a large, nonlinear eigenvalue minimization routine whose basis is the epipolar constraint. The eigenvalue matrix encapsulates all point correspondences between every pair of cameras in such a way that minimizing the smallest eigenvalue yields the projection matrices, up to a single perspective transformation. In a second step, given additional data from waving a rod with two LEDs (one at each end), the full projection matrices are calculated. The method is extremely accurate: the reprojections of the reconstructed points were within a pixel.
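
As a rough illustration of the epipolar-constraint cost that such a minimization builds on, the sketch below handles a single camera pair only (not the authors' full multi-camera routine): each correspondence contributes one row to a design matrix, and the smallest eigenvalue of its normal matrix measures how consistently some 3x3 matrix F can satisfy all epipolar constraints. All function names, the synthetic camera matrices, and the simulated LED points are illustrative assumptions, not taken from the paper.

    import numpy as np

    def epipolar_design_matrix(x1, x2):
        """One row per correspondence: row_i = kron(x2_i, x1_i), so that
        row_i @ F.ravel() == x2_i^T F x1_i (the epipolar constraint)."""
        return np.stack([np.kron(b, a) for a, b in zip(x1, x2)])

    def smallest_eigenvalue_cost(x1, x2):
        """Smallest eigenvalue of A^T A (i.e. smallest singular value of A,
        squared). It is zero exactly when some 3x3 matrix F satisfies every
        epipolar constraint (no rank-2 constraint imposed here), so it acts
        as a pairwise consistency cost for the correspondences."""
        A = epipolar_design_matrix(x1, x2)
        return np.linalg.svd(A, compute_uv=False)[-1] ** 2

    if __name__ == "__main__":
        rng = np.random.default_rng(0)
        # Two hypothetical projection matrices: camera 2 translated along x.
        P1 = np.hstack([np.eye(3), np.zeros((3, 1))])
        P2 = np.hstack([np.eye(3), np.array([[-1.0], [0.0], [0.0]])])
        # Synthetic "LED" points in homogeneous coordinates, in front of both cameras.
        X = np.hstack([rng.uniform(-1, 1, (50, 2)),
                       rng.uniform(4, 8, (50, 1)),
                       np.ones((50, 1))])
        x1 = (P1 @ X.T).T  # homogeneous image points in camera 1
        x2 = (P2 @ X.T).T  # homogeneous image points in camera 2
        print(smallest_eigenvalue_cost(x1, x2))  # ~0 for noise-free projections

In the spirit of the paper, one would stack such constraints for every camera pair with overlapping fields of view into one large system and minimize the resulting smallest eigenvalue over the camera parameters; the sketch above only evaluates the cost for fixed, known cameras.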