This paper describes a technique for generating high-quality light field representations from volumetric data. We show how light field galleries can be created to give inexperienced audiences access to interactive, high-quality volume renditions. The proposed light field representation is lightweight with respect to storage and bandwidth requirements and is thus well suited as an exchange format for visualization results, especially for web galleries. The approach extends an existing sphere-hemisphere light field parameterization with per-pixel depth. High-quality paraboloid maps are generated from volumetric data using GPU-based ray-casting or slicing approaches. Different layers, such as (but not restricted to) isosurfaces, can be generated independently and composited in real time, allowing the user to interactively explore the model and change visibility parameters at run-time.
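The paraboloid maps mentioned in the abstract rely on the standard paraboloid parameterization, which maps a unit direction on a hemisphere to 2D texture coordinates. The sketch below shows this mapping and its inverse; it is a minimal illustration of the general technique, not the authors' exact implementation, and the function names are hypothetical.

```python
def paraboloid_uv(d):
    """Map a unit direction d = (x, y, z) on the front hemisphere (z >= 0)
    to paraboloid-map texture coordinates (u, v) in [-1, 1]^2.
    Standard paraboloid parameterization; a sketch, not the paper's code."""
    x, y, z = d
    return (x / (1.0 + z), y / (1.0 + z))

def uv_to_direction(u, v):
    """Inverse mapping: recover the unit hemisphere direction from (u, v)."""
    s = u * u + v * v              # equals (1 - z) / (1 + z) for a unit direction
    z = (1.0 - s) / (1.0 + s)
    return (u * (1.0 + z), v * (1.0 + z), z)
```

Storing one such map per hemisphere sample of the sphere parameterization, together with a per-pixel depth value, is what allows novel views to be reconstructed with correct parallax.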