Model-based solid texture synthesis for anatomic volume illustration

  • Authors:
  • Ilknur Kabul; Derek Merck; Julian Rosenman

  • Affiliations:
  • Department of Computer Science, UNC Chapel Hill; Department of Computer Science, UNC Chapel Hill; Department of Radiation Oncology, UNC Chapel Hill

  • Venue:
  • EG VCBM '10: Proceedings of the 2nd Eurographics Conference on Visual Computing for Biology and Medicine
  • Year:
  • 2010

Abstract

Medical illustrations can make powerful use of texture synthesis to convey information about anatomic structures in an attractive, effective, and understandable way. Current visualization methods cannot convey detailed information about the orientation, internal structure, and other local properties of anatomical objects for a particular patient, because imaging modalities such as CT and MRI do not capture this information. In this paper, we propose a new anatomical rendering method that uses model-based synthesis of 3D textures to distinguish and illustrate different structures inside the model. The goal of our volume illustration approach is to visualize structural information by accounting for directions and layers when synthesizing high-quality, high-resolution solid textures. Our method uses the medial coordinates of 3D models and 2D exemplar textures to generate solid textures that change progressively in orientation and material according to the local orientation and transition information implicit in the anatomic region. Discrete medial 3D anatomical models ("m-reps") provide the orientation field and texture variation maps inside image regions. We demonstrate the robustness of our method with a variety of textures applied to different anatomical structures, such as muscles and the mandible.
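
The abstract describes texture lookups steered by a per-voxel orientation field and blended by a transition map. As a rough intuition only, the following is a minimal Python sketch of that idea, not the authors' m-rep-based synthesis algorithm: the orientation frames, the identity-frame test data, the simple projection-based exemplar lookup, and all function names are illustrative assumptions.

    import numpy as np

    def bilinear_sample(exemplar, u, v):
        """Bilinearly sample a 2D RGB exemplar at wrapping texture coords (u, v)."""
        h, w = exemplar.shape[:2]
        x = (u % 1.0) * (w - 1)
        y = (v % 1.0) * (h - 1)
        x0, y0 = np.floor(x).astype(int), np.floor(y).astype(int)
        x1, y1 = np.minimum(x0 + 1, w - 1), np.minimum(y0 + 1, h - 1)
        fx, fy = (x - x0)[..., None], (y - y0)[..., None]
        top = exemplar[y0, x0] * (1 - fx) + exemplar[y0, x1] * fx
        bot = exemplar[y1, x0] * (1 - fx) + exemplar[y1, x1] * fx
        return top * (1 - fy) + bot * fy

    def synthesize_solid_texture(exemplar_a, exemplar_b, frames, blend, scale=4.0):
        """Fill a voxel grid by sampling 2D exemplars in per-voxel rotated frames.

        exemplar_a, exemplar_b : (H, W, 3) arrays, exemplars for two materials.
        frames : (Z, Y, X, 3, 3) per-voxel rotation matrices standing in for an
                 orientation field (e.g. one derived from medial coordinates).
        blend  : (Z, Y, X) array in [0, 1], a material/transition map.
        """
        Z, Y, X = blend.shape
        zz, yy, xx = np.meshgrid(np.arange(Z), np.arange(Y), np.arange(X), indexing="ij")
        pos = np.stack([xx, yy, zz], axis=-1) / float(max(X, Y, Z))
        # Express each voxel position in its local frame, then reuse two of the
        # rotated coordinates as texture coordinates, so stripes or fibres in the
        # exemplar follow the local orientation.
        local = np.einsum("zyxij,zyxj->zyxi", frames, pos)
        u, v = local[..., 0] * scale, local[..., 1] * scale
        tex_a = bilinear_sample(exemplar_a, u, v)
        tex_b = bilinear_sample(exemplar_b, u, v)
        # Blend the two materials according to the transition map.
        return tex_a * (1 - blend)[..., None] + tex_b * blend[..., None]

    if __name__ == "__main__":
        rng = np.random.default_rng(0)
        ex_a = rng.random((64, 64, 3))   # stand-in for a fibrous muscle exemplar
        ex_b = rng.random((64, 64, 3))   # stand-in for a bone exemplar
        Zd = Yd = Xd = 32
        frames = np.broadcast_to(np.eye(3), (Zd, Yd, Xd, 3, 3)).copy()
        blend = np.broadcast_to(np.linspace(0.0, 1.0, Xd), (Zd, Yd, Xd)).copy()
        solid = synthesize_solid_texture(ex_a, ex_b, frames, blend)
        print(solid.shape)               # (32, 32, 32, 3)

In the paper's approach, the orientation field and texture variation maps come from the discrete medial model (m-rep) rather than being constructed by hand, and this naive per-voxel lookup would be replaced by full exemplar-based solid texture synthesis.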