Dynamic defocus and occlusion compensation of projected imagery by model-based optimal projector selection in multi-projection environment

  • Authors:
Momoyo Nagase, Daisuke Iwai, Kosuke Sato

  • Affiliations:
Osaka University, Graduate School of Engineering Science, 1-3-D554, Machikaneyama, 560-8531, Toyonaka, Osaka, Japan (all authors)

  • Venue:
  • Virtual Reality - Special Issue on Augmented Reality
  • Year:
  • 2011

Abstract

This paper presents a novel model-based method for dynamic defocus and occlusion compensation in a multi-projection environment. Conventional defocus compensation research applies appearance-based methods, which require a point spread function (PSF) calibration whenever the position or orientation of the projection object changes, and therefore cannot be applied to interactive applications in which the object moves dynamically. In contrast, we propose a model-based method in which PSF and geometric calibrations are performed only once in advance, and each projector's PSF is then computed online from the geometric relationship between the projector and the object, without any additional calibration. We propose to distinguish oblique blur (the loss of high-spatial-frequency components caused by the incidence angle of the projection light) from defocus blur and to incorporate it into the PSF computation. For each part of the object surface, we select the optimal projector, i.e., the one that preserves the largest amount of high-spatial-frequency components of the original image, to realize defocus-free projection. The same geometric relationship can also be used to eliminate cast shadows of the projected images in the multi-projection environment. Our method is particularly useful in interactive systems because the movement of the object (and consequently the geometric relationship between each projector and the object) is usually measured by an attached tracking sensor. This paper describes the proposed approach in detail along with a prototype implementation. We performed two proof-of-concept experiments to show the feasibility of our approach.
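
The per-patch projector selection described above can be pictured with a minimal sketch. The following Python snippet is not from the paper: it assumes a simplified Gaussian-like blur model in which the PSF width grows with the patch's deviation from a projector's focal distance (defocus blur) and with the incidence angle of the projection ray (oblique blur). All names and coefficients (psf_sigma, k_defocus, k_oblique, focal_dist) are illustrative stand-ins for the values the paper obtains from its one-time PSF and geometric calibrations.

```python
import numpy as np

def psf_sigma(patch_pos, patch_normal, proj_pos, focal_dist,
              k_defocus=0.5, k_oblique=1.0):
    """Hypothetical blur width (PSF sigma) of one projector at one patch."""
    ray = patch_pos - proj_pos
    dist = np.linalg.norm(ray)
    ray_dir = ray / dist
    # Defocus blur: grows with the deviation from the projector's focal distance.
    defocus = k_defocus * abs(dist - focal_dist)
    # Oblique blur: grows as the projection ray grazes the surface (cos -> 0).
    cos_theta = max(1e-3, float(np.dot(-ray_dir, patch_normal)))
    oblique = k_oblique * (1.0 / cos_theta - 1.0)
    return defocus + oblique

def select_projectors(patches, normals, projectors):
    """For each surface patch, pick the projector with the narrowest modeled PSF.

    A narrower PSF attenuates high spatial frequencies less, so minimizing
    sigma is a simple proxy for the frequency-preservation criterion the
    paper uses for optimal projector selection.
    """
    labels = []
    for p, n in zip(patches, normals):
        sigmas = [psf_sigma(p, n, prj["pos"], prj["focal_dist"])
                  for prj in projectors]
        labels.append(int(np.argmin(sigmas)))
    return labels

if __name__ == "__main__":
    # Two projectors facing a small planar object from different sides.
    projectors = [
        {"pos": np.array([-1.0, 0.0, 2.0]), "focal_dist": 2.2},
        {"pos": np.array([+1.0, 0.0, 2.0]), "focal_dist": 2.2},
    ]
    patches = [np.array([x, 0.0, 0.0]) for x in np.linspace(-0.5, 0.5, 5)]
    normals = [np.array([0.0, 0.0, 1.0])] * len(patches)
    print(select_projectors(patches, normals, projectors))
```

Because the selection depends only on the geometric relationship between each projector and the tracked object, the same per-patch labels can also be recomputed every frame as the object moves, which is what makes the model-based formulation suitable for interactive use.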