Skin Texture Modeling

  • Authors:
  • Oana G. Cula, Kristin J. Dana, Frank P. Murphy, Babar K. Rao

  • Affiliations:
  • Department of Computer Science and Department of Electrical and Computer Engineering, Rutgers University, Piscataway, USA
  • Department of Dermatology, University of Medicine and Dentistry of New Jersey, New Brunswick, USA

  • Venue:
  • International Journal of Computer Vision - Special Issue on Texture Analysis and Synthesis
  • Year:
  • 2005

Abstract

Quantitative characterization of skin appearance is an important but difficult task. The skin surface is a detailed landscape with complex geometry and local optical properties. In addition, skin features depend on many variables such as body location (e.g., forehead, cheek), subject parameters (age, gender), and imaging parameters (lighting, camera). As with many real-world surfaces, skin appearance is strongly affected by the directions from which it is viewed and illuminated. Computational modeling of skin texture has potential uses in many applications, including realistic rendering for computer graphics, robust face models for computer vision, computer-assisted diagnosis for dermatology, topical drug efficacy testing for the pharmaceutical industry, and quantitative comparison for consumer products. In this work we present models and measurements of skin texture, with an emphasis on faces. We develop two models for use in skin texture recognition. Both models are image-based representations of skin appearance that are suitably descriptive without the need for prohibitively complex physics-based skin models. Our models take into account the varied appearance of the skin with changes in illumination and viewing direction. We also present a new face texture database comprising more than 2400 images, corresponding to 20 human faces, 4 locations on each face (forehead, cheek, chin and nose), and 32 combinations of imaging angles. The complete database is made publicly available for further research.
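The stated database layout (20 faces × 4 locations × 32 imaging-angle combinations) implies at most 20 × 4 × 32 = 2560 image slots, consistent with the reported count of more than 2400 images. A minimal sketch of enumerating such a layout is shown below; the filename pattern and identifiers are illustrative assumptions, not the actual database schema.

```python
# Hypothetical enumeration of the face texture database layout described
# in the abstract: 20 subjects x 4 face locations x 32 imaging-angle
# combinations. Names and file pattern are illustrative assumptions.
from itertools import product

subjects = range(1, 21)                              # 20 human faces
locations = ["forehead", "cheek", "chin", "nose"]    # 4 face locations
angle_combos = range(32)                             # 32 view/illumination combinations

image_names = [
    f"subj{s:02d}_{loc}_angle{a:02d}.png"
    for s, loc, a in product(subjects, locations, angle_combos)
]

print(len(image_names))  # 2560 nominal slots; the paper reports >2400 images
```

In practice some slots may be missing (e.g., unusable captures), which is why the reported total is "more than 2400" rather than the full 2560.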