This paper describes a new method for estimating the illumination distribution of a real scene. Shadows in a real scene are usually observed as soft shadows without sharp edges. The proposed method estimates the illumination distribution of the real scene from the radiance distribution inside the soft shadows cast by an object in the scene. By observing shadows rather than the illumination itself, the proposed method avoids two technical difficulties that previous methods suffered from: capturing the wide field of view of the entire scene and capturing the high dynamic range of the illumination. The estimated illumination distribution is then used for rendering virtual objects superimposed onto images of the real scene. We successfully tested the proposed method on real images to demonstrate its effectiveness.
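The core idea can be illustrated with a toy sketch (all names and sizes below are assumptions for illustration, not the paper's implementation): discretize the illumination distribution into a small number of directional sources, model each shadow pixel's radiance as a linear combination of the source intensities weighted by per-direction visibility and shading coefficients, and solve the resulting linear system for the source intensities.

```python
import numpy as np

# Toy sketch (hypothetical setup, not the authors' code): recover a
# discretized illumination distribution from radiance samples taken
# inside a soft shadow, assuming a linear image-formation model.

rng = np.random.default_rng(0)

n_sources = 8   # discretized light directions (assumed)
n_pixels = 50   # radiance samples inside the soft shadow (assumed)

# A[i, j] couples shadow pixel i to light source j; it bundles the
# visibility term (0 where the occluder blocks direction j) and a
# Lambertian cosine/albedo factor into one coefficient.
A = rng.uniform(0.0, 1.0, size=(n_pixels, n_sources))

# Ground-truth source intensities, used only to synthesize data.
L_true = rng.uniform(0.5, 2.0, size=n_sources)

# Observed shadow radiance under the linear model (noiseless here).
b = A @ L_true

# Estimate the illumination distribution by least squares.
L_est, *_ = np.linalg.lstsq(A, b, rcond=None)

print(np.allclose(L_est, L_true, atol=1e-6))  # → True
```

With noiseless synthetic data the overdetermined system is consistent, so least squares recovers the intensities exactly; real measurements would add noise, and a nonnegativity constraint on the intensities would be a natural refinement.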