Texture for volume character animation

  • Authors:
  • Peiyi Shen; Philip Willis

  • Affiliations:
  • University of Bath, Bath, UK; University of Bath, Bath, UK

  • Venue:
  • GRAPHITE '05: Proceedings of the 3rd International Conference on Computer Graphics and Interactive Techniques in Australasia and South East Asia
  • Year:
  • 2005

Abstract

We present an approach to texture mapping volume datasets based on multiple semantic constraints. The approach uses continuous-space mappings to ensure good image quality. It requires only one intervention by the user: specifying key points where the texture must match an intermediate image of the original data. This also avoids the problem of texture being smeared over too large an area. The method uses three passes. In the first pass, the texture is warped to an intermediate surface, which is itself a projection of the volume object. This ensures that the texture correctly aligns with the user-supplied key points of the object, and pinning the texture in place at these key points allows the volume object to be animated. When rendering the object, two further passes are used: each point being rendered is first inversely mapped to a plenoptic surface and then inversely mapped to the intermediate surface. This three-pass structure additionally allows the texture image to be independent of the volume data. A data-dependent triangulation method is used to retain edge quality in texture images. We also demonstrate an extension to 2.5D textures, extruded through the volume, using an approach consistent with the 2D texture; this supports interactive sculpting of the volume model. Our overall goal is to animate characters using textured volume data.
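The abstract's rendering stage (passes two and three) amounts to a per-point texture lookup: each rendered volume point is inverse-mapped to a plenoptic surface and from there to the intermediate surface carrying the key-point-warped texture. The sketch below is a minimal illustration of that lookup chain, not the authors' implementation; the spherical plenoptic parameterisation, the function names, and the bilinear sampling are all assumptions made for the example.

```python
# Hypothetical sketch of the per-point lookup described in the abstract:
# rendered point -> plenoptic surface -> intermediate surface -> texture.
# All mappings here are illustrative placeholders.
import numpy as np

def inverse_map_to_plenoptic(p):
    """Placeholder: project a rendered 3D point onto a spherical
    plenoptic surface, returning (theta, phi) angles."""
    x, y, z = p
    r = np.linalg.norm(p) + 1e-9
    theta = np.arccos(np.clip(z / r, -1.0, 1.0))  # polar angle
    phi = np.arctan2(y, x)                        # azimuth
    return theta, phi

def inverse_map_to_intermediate(theta, phi):
    """Placeholder: map plenoptic coordinates to (u, v) on the
    intermediate surface holding the key-point-warped texture."""
    u = phi / (2.0 * np.pi) + 0.5
    v = theta / np.pi
    return u, v

def sample_texture(texture, u, v):
    """Bilinear sample of a 2D texture at normalised (u, v)."""
    h, w = texture.shape[:2]
    x, y = u * (w - 1), v * (h - 1)
    x0, y0 = int(np.floor(x)), int(np.floor(y))
    x1, y1 = min(x0 + 1, w - 1), min(y0 + 1, h - 1)
    fx, fy = x - x0, y - y0
    top = (1 - fx) * texture[y0, x0] + fx * texture[y0, x1]
    bot = (1 - fx) * texture[y1, x0] + fx * texture[y1, x1]
    return (1 - fy) * top + fy * bot

if __name__ == "__main__":
    texture = np.random.rand(256, 256, 3)       # stand-in texture image
    rendered_points = np.random.randn(5, 3)     # stand-in volume samples
    for p in rendered_points:
        theta, phi = inverse_map_to_plenoptic(p)          # second pass
        u, v = inverse_map_to_intermediate(theta, phi)    # third pass
        colour = sample_texture(texture, u, v)
        print(p, "->", colour)
```

In the paper's pipeline the first pass (warping the texture to the intermediate surface using the user's key points) happens once per texture, so the per-frame cost during animation is only the two inverse mappings shown above.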