Stereo geometry from 3D ego-motion streams

  • Authors:
  • F. Dornaika; C. R. Chung

  • Affiliations:
  • Dept. of Autom. & Comput.-Aided Eng., Chinese Univ. of Hong Kong, Shatin, China

  • Venue:
  • IEEE Transactions on Systems, Man, and Cybernetics, Part B: Cybernetics
  • Year:
  • 2003

Abstract

This paper addresses the problem of determining the geometry of a stereo rig that undergoes general rigid motions. Neither known reference objects nor stereo correspondence is required. With almost no exception, existing online solutions attempt to recover stereo geometry by first establishing stereo correspondences. We first describe a mathematical framework that allows the stereo geometry, i.e., the rotation and translation between the two cameras, to be solved for using only motion correspondence, which is far easier to acquire than stereo correspondence. Second, we show how to recover the rotation, and present two linear methods as well as a nonlinear one for solving the translation. Third, we study the stability of the developed methods in the presence of image noise, camera-parameter noise, and ego-motion noise, and address accuracy issues. Experiments with real image data are presented. The work broadens the concept of online calibration: it is no longer true that only single cameras can exploit structure-from-motion strategies; the extrinsic parameters of a stereo rig can also be recovered this way, without solving stereo correspondence. The developed framework is applicable to estimating the relative three-dimensional (3D) geometry of a wide variety of mounted devices used in vision and robotics, by exploiting their scaled ego-motion streams.
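
To illustrate the kind of constraint the abstract describes, the sketch below shows one standard way to recover a fixed rotation R and translation t between two rigidly linked cameras from matched ego-motion pairs (A_i, B_i), using the rigid-link relation A_i X = X B_i. This is a minimal illustration, not the authors' specific algorithms: it assumes metric (known-scale) translations and synthetic noise-free motions, solves the rotation by a linear least-squares fit on rotation axes, and the translation by a linear least-squares solve; all function names and the test data are hypothetical.

    # Sketch: relative geometry of two rigidly linked cameras from ego-motion
    # pairs, via the rigid-link constraint A_i X = X B_i (not the paper's
    # exact methods). Assumes known-scale translations and numpy only.
    import numpy as np

    def rotation_axis(R):
        """Unit rotation axis from the skew-symmetric part of R (angle != 0, pi)."""
        v = np.array([R[2, 1] - R[1, 2], R[0, 2] - R[2, 0], R[1, 0] - R[0, 1]])
        return v / np.linalg.norm(v)

    def estimate_rotation(motions):
        """Least-squares R with a_i ~ R b_i on rotation axes (orthogonal Procrustes)."""
        M = sum(np.outer(rotation_axis(Ra), rotation_axis(Rb))
                for (Ra, _), (Rb, _) in motions)
        U, _, Vt = np.linalg.svd(M)
        D = np.diag([1.0, 1.0, np.linalg.det(U @ Vt)])
        return U @ D @ Vt

    def estimate_translation(motions, R):
        """Stack (R_Ai - I) t = R t_Bi - t_Ai and solve by linear least squares."""
        C = np.vstack([Ra - np.eye(3) for (Ra, _), _ in motions])
        d = np.hstack([R @ tb - ta for (_, ta), (_, tb) in motions])
        t, *_ = np.linalg.lstsq(C, d, rcond=None)
        return t

    if __name__ == "__main__":
        rng = np.random.default_rng(0)

        def random_rotation():
            q = rng.normal(size=4)
            q /= np.linalg.norm(q)
            w, x, y, z = q
            return np.array([
                [1 - 2*(y*y + z*z), 2*(x*y - w*z),     2*(x*z + w*y)],
                [2*(x*y + w*z),     1 - 2*(x*x + z*z), 2*(y*z - w*x)],
                [2*(x*z - w*y),     2*(y*z + w*x),     1 - 2*(x*x + y*y)],
            ])

        # Synthetic rig geometry and ego-motion pairs consistent with A X = X B.
        R_true, t_true = random_rotation(), rng.normal(size=3)
        motions = []
        for _ in range(5):
            Rb, tb = random_rotation(), rng.normal(size=3)
            Ra = R_true @ Rb @ R_true.T
            ta = R_true @ tb + t_true - Ra @ t_true
            motions.append(((Ra, ta), (Rb, tb)))

        R_est = estimate_rotation(motions)
        t_est = estimate_translation(motions, R_est)
        print("rotation error:", np.linalg.norm(R_est - R_true))
        print("translation error:", np.linalg.norm(t_est - t_true))

With noise-free inputs the estimated R and t match the ground truth to machine precision; with noisy or scale-ambiguous ego-motion translations (the "scaled ego-motion streams" mentioned in the abstract), a nonlinear refinement of the kind the paper describes would be needed on top of such a linear initialization.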