Stick textures for image-based rendering
Graphical Models, Special Issue on SPM '05
This paper presents an extension to texture mapping that improves the efficiency of image-based rendering. Given a depth image with an orthogonal displacement at each pixel, the image is decomposed by displacement into a series of layered textures (LTs), each having the same displacement for all of its texels. Some texels of the layered textures are then interpolated to obtain a continuous 3D approximation of the model represented by the depth image. Plane-to-plane texture mapping can therefore be used to map these layered textures and produce novel views, which yields the following advantages: faster rendering, support for 3D surface details and view motion parallax, and avoidance of the expensive hole-filling task at the rendering stage. Experimental results show that the new method produces high-quality images and runs faster than many well-known image-based rendering techniques.
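The decomposition step described above can be sketched as follows. This is a minimal illustrative sketch, not the paper's exact formulation: it assumes displacements are normalized to [0, 1) and are quantized uniformly into a fixed number of layers, with each layer recording which texels carry that layer's constant displacement. All names and the uniform quantization are assumptions for illustration.

```python
# Hypothetical sketch: decompose a depth image into layered textures (LTs),
# where every texel assigned to a layer shares that layer's displacement.
# Uniform quantization of displacement into num_layers slices is assumed.

def decompose_depth_image(depth, num_layers):
    """Split a 2D depth image into num_layers layered textures.

    depth: list of rows of float displacements in [0, 1).
    Returns a list of layers; each layer is a 2D boolean mask marking
    the texels that belong to that layer's displacement slice.
    """
    layers = [[[False] * len(row) for row in depth] for _ in range(num_layers)]
    for y, row in enumerate(depth):
        for x, d in enumerate(row):
            # Quantize the displacement to pick the layer index.
            k = min(int(d * num_layers), num_layers - 1)
            layers[k][y][x] = True
    return layers

# Tiny example: two depth levels map to two distinct layers.
depth = [
    [0.05, 0.05, 0.60],
    [0.05, 0.60, 0.60],
]
layers = decompose_depth_image(depth, num_layers=4)
```

Each resulting layer is a planar texture at a single displacement, so it can be mapped with an ordinary plane-to-plane texture mapping; the interpolation of texels between layers (which this sketch omits) is what closes the gaps and avoids hole-filling at render time.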