ESM-Blur: Handling & rendering blur in 3D tracking and augmentation

  • Authors:
  • Youngmin Park; Vincent Lepetit; Woontack Woo

  • Affiliations:
  • GIST, U-VR Lab, Korea; EPFL, CVLab, Switzerland; GIST, U-VR Lab, Korea

  • Venue:
  • ISMAR '09 Proceedings of the 2009 8th IEEE International Symposium on Mixed and Augmented Reality
  • Year:
  • 2009

Abstract

The contribution of this paper is two-fold. First, we show how to extend the ESM algorithm to handle motion blur in 3D object tracking. ESM is a powerful algorithm for template matching-based tracking, but it can fail under motion blur. We introduce an image formation model that explicitly accounts for blur, and show that it results in a generalization of the original ESM algorithm. This allows the tracker to converge faster, more accurately, and more robustly, even under large amounts of blur. Our second contribution is an efficient method for rendering virtual objects under the estimated motion blur. It renders two images of the object under 3D perspective and warps them to create many intermediate images. By fusing these images we obtain a final image of the virtual objects blurred consistently with the captured image. Because warping is much faster than 3D rendering, we can create realistically blurred images at a very low computational cost.
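
The second contribution, synthesizing motion blur by fusing warped images instead of re-rendering, can be illustrated with a rough sketch. The snippet below is only an approximation of the idea and not the authors' implementation: it assumes a single sharp rendering and a homography `H_exposure` describing the object's apparent image motion over the exposure, whereas the paper warps two perspective renderings; the function name, the linear interpolation of homographies, and `n_steps` are illustrative assumptions.

```python
# Minimal sketch: emulate motion blur of a rendered virtual object by
# averaging copies of the sharp rendering warped to intermediate positions
# spanning the exposure interval. Assumptions (not from the paper): a single
# sharp input image and linear interpolation between the identity and the
# full-exposure homography.

import cv2
import numpy as np

def blur_by_warping(sharp, H_exposure, n_steps=16):
    """Average warped copies of `sharp` to approximate motion blur.

    sharp      : sharp rendering of the virtual object (H x W x 3, uint8)
    H_exposure : 3x3 homography for the apparent motion accumulated over
                 the camera exposure time (illustrative assumption)
    n_steps    : number of intermediate images to fuse
    """
    h, w = sharp.shape[:2]
    identity = np.eye(3)
    acc = np.zeros((h, w, 3), dtype=np.float64)
    for i in range(n_steps):
        t = i / (n_steps - 1)  # fraction of the exposure interval
        # Crude stand-in for re-rendering at an intermediate pose:
        # interpolate between the identity and the full-exposure homography.
        H_t = (1.0 - t) * identity + t * H_exposure
        warped = cv2.warpPerspective(sharp, H_t, (w, h),
                                     flags=cv2.INTER_LINEAR)
        acc += warped
    return (acc / n_steps).astype(np.uint8)
```

Because each intermediate image is produced by a 2D warp rather than a full 3D rendering pass, the cost per fused image is small, which is the source of the low computational cost claimed in the abstract.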