Online Estimation of Trifocal Tensors for Augmenting Live Video

  • Authors:
  • Jia Li; Robert Laganiere; Gerhard Roth

  • Affiliations:
  • University of Ottawa, Canada; University of Ottawa, Canada; National Research Council, Canada

  • Venue:
  • ISMAR '04: Proceedings of the 3rd IEEE/ACM International Symposium on Mixed and Augmented Reality
  • Year:
  • 2004

Abstract

We propose a method for augmenting live video based on the tracking of natural features and on the online estimation of the trinocular geometry. Previous markerless approaches require the computation of the camera pose in order to render virtual objects. The strength of the proposed method is that it does not require camera pose tracking, while retaining the usual advantage of marker-based approaches, namely a fast implementation. A three-view AR system is used to demonstrate the approach. It consists of an uncalibrated camera that moves freely inside the scene of interest and of three reference frames taken at the time of system initialization. As the camera moves, image features taken from an initial triplet set are tracked throughout the video sequence, and the trifocal tensor associated with each frame is estimated online. With this tensor, the square pattern that was visible in the reference frames is transferred into the current video frame. This invisible pattern is then used by the ARToolkit to embed virtual objects.
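
The core geometric operation behind the transfer step is point transfer with a trifocal tensor. The sketch below, in Python with NumPy, illustrates the standard point-line-point transfer x3^k = x1^i * l2_j * T_i^{jk} (in the sense of Hartley and Zisserman); it is not the paper's implementation, and the tensor layout T[i, j, k], the helper name transfer_point, and the particular choice of line through the second-view point are illustrative assumptions.

    import numpy as np

    def transfer_point(T, x1, x2, line2=None):
        # T      : 3x3x3 array with T[i, j, k] = T_i^{jk}, relating views 1, 2 and 3.
        # x1, x2 : corresponding points in views 1 and 2, homogeneous 3-vectors.
        # line2  : a line through x2 in view 2; if None, a generic horizontal line
        #          through x2 is used (valid as long as it is not the epipolar line of x1).
        x1 = np.asarray(x1, dtype=float)
        x2 = np.asarray(x2, dtype=float)
        if line2 is None:
            # Line joining x2 to the point at infinity in the x-direction.
            line2 = np.cross(x2, np.array([1.0, 0.0, 0.0]))
        x3 = np.einsum('i,j,ijk->k', x1, line2, T)  # x3^k = x1^i * l2_j * T[i, j, k]
        return x3 / x3[2]                           # normalise so the last coordinate is 1

    # Example: transfer the four corners of the reference square into the live frame.
    # T_live is assumed to be the tensor estimated online for the current frame;
    # corners_1 and corners_2 hold the corner positions observed in two reference views.
    # live_corners = [transfer_point(T_live, c1, c2) for c1, c2 in zip(corners_1, corners_2)]

Transferring the four corners of the reference square in this way gives its location in the current frame, which can then be handed to a marker-based renderer such as the ARToolkit.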