Automatic perceptual color map generation for realistic volume visualization

  • Authors:
  • Jonathan C. Silverstein; Nigel M. Parsad; Victor Tsirline

  • Affiliations:
  • Departments of Surgery, Radiology, and Computation Institute, University of Chicago and Argonne National Laboratory, Research Institutes, Suite 405, 5640 South Ellis Avenue, Chicago, IL 60637, USA (all authors)

  • Venue:
  • Journal of Biomedical Informatics
  • Year:
  • 2008


  • Keywords:
  • Visualization

Abstract

Advances in computed tomography imaging technology and inexpensive, high-performance computer graphics hardware are making high-resolution, full-color (24-bit) volume visualizations commonplace. However, many of the color maps used in volume rendering provide questionable value in knowledge representation and are non-perceptual, thus biasing data analysis or even obscuring information. These drawbacks, coupled with our need for realistic anatomical volume rendering for teaching and surgical planning, have motivated us to explore the automatic generation of color maps that combine natural colorization with the perceptual discriminating capacity of grayscale. As evidenced by the examples created by the algorithm described, the merging of perceptually accurate and realistically colorized virtual anatomy appears to insightfully interpret and impartially enhance volume-rendered patient data.
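
The sketch below is not the authors' published algorithm; it is a minimal illustration of the general idea the abstract describes: keeping the perceptual ordering of grayscale (a monotonic lightness ramp) while drawing hue and chroma from naturalistic tissue colors. The function name, the illustrative tissue anchors, and the use of NumPy and scikit-image for CIELAB conversion are all assumptions for the sake of the example.

```python
# A minimal sketch (not the authors' algorithm): re-impose a monotonically
# increasing CIELAB lightness (L*) on a palette of naturalistic tissue colors,
# so the resulting color map preserves grayscale's perceptual ordering while
# retaining realistic hues. Assumes numpy and scikit-image are installed;
# the tissue anchor colors below are purely illustrative.

import numpy as np
from skimage.color import rgb2lab, lab2rgb

def perceptual_colormap(anchors, n=256):
    """Build an n-entry RGB color map (hypothetical helper).

    anchors : list of (position in [0, 1], (r, g, b) in [0, 1]) pairs giving
              naturalistic colors at increasing scalar-intensity positions.
    """
    pos = np.array([p for p, _ in anchors])
    rgb = np.array([c for _, c in anchors], dtype=float)

    # Convert anchor colors to CIELAB and interpolate the a*/b*
    # (hue/chroma) channels across the full scalar range.
    lab = rgb2lab(rgb[np.newaxis, :, :])[0]          # shape (k, 3)
    x = np.linspace(0.0, 1.0, n)
    a = np.interp(x, pos, lab[:, 1])
    b = np.interp(x, pos, lab[:, 2])

    # Discard the anchors' own lightness and substitute a strictly
    # monotonic L* ramp, so equal intensity steps map to roughly equal
    # perceived lightness steps, as in a grayscale map.
    L = np.linspace(5.0, 95.0, n)

    lab_map = np.stack([L, a, b], axis=1)[np.newaxis, :, :]
    return np.clip(lab2rgb(lab_map)[0], 0.0, 1.0)    # (n, 3) RGB in [0, 1]

# Illustrative anchors only: background, soft tissue, muscle, bone.
anchors = [
    (0.00, (0.00, 0.00, 0.00)),
    (0.35, (0.55, 0.25, 0.20)),   # soft tissue (reddish brown)
    (0.60, (0.75, 0.35, 0.30)),   # muscle
    (1.00, (0.95, 0.92, 0.85)),   # bone (near white)
]
cmap = perceptual_colormap(anchors)  # usable as a transfer-function lookup table
```

In this reading, holding L* to a monotonic ramp is what keeps the map "perceptual" in the grayscale sense, while the interpolated a*/b* channels supply the natural colorization; the actual paper's method of choosing and blending colors may differ.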