DRAPE: DRessing Any PErson

  • Authors:
  • Peng Guan; Loretta Reiss; David A. Hirshberg; Alexander Weiss; Michael J. Black

  • Affiliations:
  • Brown University, Providence; Brown University, Providence; Max Planck Institute for Intelligent Systems, Tübingen; Brown University, Providence; Brown University, Providence and Max Planck Institute for Intelligent Systems, Tübingen

  • Venue:
  • ACM Transactions on Graphics (TOG), SIGGRAPH 2012 Conference Proceedings
  • Year:
  • 2012


Abstract

We describe a complete system for animating realistic clothing on synthetic bodies of any shape and pose without manual intervention. The key component of the method is a model of clothing called DRAPE (DRessing Any PErson) that is learned from a physics-based simulation of clothing on bodies of different shapes and poses. The DRAPE model has the desirable property of "factoring" clothing deformations due to body shape from those due to pose variation. This factorization provides an approximation to the physical clothing deformation and greatly simplifies clothing synthesis. Given a parameterized model of the human body with known shape and pose parameters, we describe an algorithm that dresses the body with a garment that is customized to fit and possesses realistic wrinkles. DRAPE can be used to dress static bodies or animated sequences with a learned model of the cloth dynamics. Since the method is fully automated, it is appropriate for dressing large numbers of virtual characters of varying shape. The method is significantly more efficient than physical simulation.
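To make the "factoring" idea concrete, below is a minimal Python sketch of a clothing model in which shape-driven and pose-driven deformations are computed independently and combined. All names (`drape_garment`, the linear bases) and the additive linear form are illustrative assumptions for this sketch; the actual DRAPE model is learned from physics-based simulation data and is not specified in this abstract.

```python
import numpy as np

def drape_garment(template_verts, shape_basis, pose_basis,
                  shape_params, pose_params):
    """Hypothetical sketch of a factored clothing deformation model.

    Garment vertices are approximated as a template deformed by two
    independent terms: one driven by body shape parameters (fit), one
    by pose parameters (wrinkles). Because the terms are factored,
    each can be learned and evaluated without reference to the other.
    """
    # Shape-driven deformation: customizes the garment's fit to the body.
    shape_offset = shape_basis @ shape_params      # (n_verts * 3,)
    # Pose-driven deformation: adds pose-dependent wrinkles.
    pose_offset = pose_basis @ pose_params         # (n_verts * 3,)
    # Additive combination (an assumption of this sketch) approximates
    # the full physical deformation at a fraction of simulation cost.
    return (template_verts
            + shape_offset.reshape(-1, 3)
            + pose_offset.reshape(-1, 3))


# Toy usage with random data standing in for learned bases.
rng = np.random.default_rng(0)
n_verts, n_shape, n_pose = 500, 10, 20
template = rng.normal(size=(n_verts, 3))
B_shape = rng.normal(size=(n_verts * 3, n_shape)) * 0.01
B_pose = rng.normal(size=(n_verts * 3, n_pose)) * 0.01
verts = drape_garment(template, B_shape, B_pose,
                      rng.normal(size=n_shape), rng.normal(size=n_pose))
print(verts.shape)  # (500, 3)
```

Evaluating such a factored model amounts to a few matrix-vector products per frame, which is why this style of approximation can be far cheaper than running a physical cloth simulation for every new body shape or pose.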