This paper reports on a new approach for visualizing multi-field MRI or CT datasets in an immersive environment with medical applications. Multi-field datasets combine multiple scanning modalities into a single 3D multivalued dataset. In our approach, these datasets are classified and rendered using real-time, hardware-accelerated volume rendering, and displayed in a hybrid work environment consisting of a dual powerwall and a desktop PC. For practical reasons in this environment, the design and use of the transfer functions are subdivided into two steps: classification and exploration. The classification step is done at the desktop, taking advantage of the 2D mouse as a high-accuracy input device. The exploration process takes place at the powerwall. We present our new approach, describe the underlying implementation issues, report on our experiences with different immersive environments, and suggest ways the approach can be used for collaborative medical diagnosis and treatment planning.
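As a rough illustration of the desktop classification step (a minimal sketch, not the paper's implementation), a multi-field transfer function can be modeled as a 2D lookup table indexed by two co-registered, normalized scan values per voxel; all names and the table layout below are hypothetical.

```python
import numpy as np

def classify(mri, ct, tf_table):
    """Map co-registered, normalized MRI/CT intensity pairs to RGBA
    via a 2D transfer-function lookup table of shape (bins, bins, 4)."""
    bins = tf_table.shape[0]
    # Quantize each normalized intensity field into table indices.
    i = np.clip((mri * (bins - 1)).astype(int), 0, bins - 1)
    j = np.clip((ct * (bins - 1)).astype(int), 0, bins - 1)
    return tf_table[i, j]  # per-voxel RGBA

# Tiny example table: red encodes MRI intensity, green encodes CT
# intensity, and opacity grows only where both fields are high.
bins = 4
u, v = np.meshgrid(np.linspace(0, 1, bins),
                   np.linspace(0, 1, bins), indexing="ij")
tf = np.stack([u, v, np.zeros_like(u), u * v], axis=-1)

# Two voxels: (low MRI, high CT) and (high MRI, high CT).
rgba = classify(np.array([0.0, 1.0]), np.array([1.0, 1.0]), tf)
```

In the workflow described above, a table like `tf` would be authored with the 2D mouse at the desktop and then uploaded to the GPU renderer driving the powerwall for interactive exploration.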