Texture Extraction and Blending without Prior Knowledge of Lighting Conditions

  • Authors:
  • H. L. Chou; C. C. Chen

  • Venue:
  • PCM '02 Proceedings of the Third IEEE Pacific Rim Conference on Multimedia: Advances in Multimedia Information Processing
  • Year:
  • 2002

Abstract

Texture mapping techniques are widely used in photo-realistic rendering of 3D models, but differing lighting and viewing parameters cause intensity differences between neighboring images, so a visible edge appears at the boundary where neighboring images are stitched together. We propose an automatic procedure to extract and blend textures. First, textures are extracted from the images and mapped to the triangles of the model; if multiple textures extracted from different images map to the same triangle, we choose the one with the largest resolution. Second, a texture blending procedure is applied: we normalize the images to user-specified base images through their overlapping areas, and each texture is then adjusted to its corresponding texture in the base images, if one exists. Finally, we check the boundaries of neighboring textures: boundary pixels that vary discontinuously are averaged, and interior pixels are reassigned colors accordingly. Experimental results show that the smooth transition between neighboring textures provides better visual quality than merely blending the boundary where neighboring images are stitched.
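The abstract does not give the exact normalization or blending formulas, so the sketch below is only one plausible reading of the two steps it describes: a gain correction that matches an image's mean intensity to a base image over their overlap, and an averaging of corresponding boundary pixels between neighboring textures. The function names, the mean-ratio gain model, and the mask-based interface are assumptions, not the authors' method.

```python
import numpy as np

def normalize_to_base(image, base, overlap_mask):
    """Scale `image` so its mean intensity over the overlapping area
    matches that of the user-specified `base` image.

    A simple mean-ratio gain is assumed here; the paper's actual
    normalization model is not specified in the abstract.
    """
    gain = base[overlap_mask].mean() / image[overlap_mask].mean()
    return image * gain

def blend_boundary(tex_a, tex_b, boundary_a, boundary_b):
    """Average corresponding boundary pixels of two neighboring textures.

    `boundary_a` and `boundary_b` are boolean masks selecting the same
    number of pixels along the shared seam, in corresponding order.
    """
    avg = (tex_a[boundary_a] + tex_b[boundary_b]) / 2.0
    tex_a, tex_b = tex_a.copy(), tex_b.copy()
    tex_a[boundary_a] = avg
    tex_b[boundary_b] = avg
    return tex_a, tex_b
```

After the seam pixels are averaged, the paper additionally reassigns interior pixel colors so the adjustment propagates into the texture rather than stopping at the boundary; that propagation step is omitted here for brevity.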