Estimating cloth simulation parameters from video. Proceedings of the 2003 ACM SIGGRAPH/Eurographics Symposium on Computer Animation.
Data-driven elastic models for cloth: modeling and measurement. ACM SIGGRAPH 2011 Papers.
Data-Driven Estimation of Cloth Simulation Models. Computer Graphics Forum.
Data-driven simulation methods in computer graphics: cloth, tissue and faces. ACM SIGGRAPH 2012 Courses.
Modeling friction and air effects between cloth and deformable bodies. ACM Transactions on Graphics (TOG), SIGGRAPH 2013 Conference Proceedings.
Realistic drape and motion of virtual clothing are now achievable with modern cloth simulators, but adjusting and tuning the many simulation parameters to reproduce the look of a particular real fabric remains difficult and time-consuming. Bhat et al. [2003] proposed estimating these parameters from video of real fabrics. However, their method projects structured light patterns onto the fabric, so it may fail to recover accurate parameter values for fabrics with colors and textures. In addition to the structured light patterns, they rely on a motion capture system to track the fabric's motion. In this paper, we introduce a new method that uses only a motion capture system, attaching a few markers to the fabric surface without any other devices. With this method, animators can easily estimate the parameters of many kinds of fabric. Realistic appearance and motion of the simulated fabric are achieved by minimizing an error function between the captured motion data and the synthetic motion, considering both static and dynamic cloth features.
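The core idea above, fitting simulator parameters by minimizing an error function between captured marker trajectories and synthetic ones, can be sketched in a toy form. The snippet below is a minimal illustration, not the paper's method: it replaces the cloth simulator with a hypothetical one-marker damped-spring model (parameters `stiffness` and `damping` standing in for cloth material parameters), generates a "captured" trajectory from ground-truth values, and recovers them with a derivative-free optimizer.

```python
import numpy as np
from scipy.optimize import minimize

def simulate(stiffness, damping, steps=200, dt=0.01):
    """Toy stand-in for a cloth simulator: one marker on a damped
    spring released from a displaced position, integrated with
    semi-implicit Euler. (Hypothetical model, not the paper's.)"""
    x, v = 1.0, 0.0
    traj = np.empty(steps)
    for i in range(steps):
        a = -stiffness * x - damping * v
        v += a * dt
        x += v * dt
        traj[i] = x
    return traj

# "Captured" marker trajectory, generated from ground-truth parameters
# (in the paper this would come from the motion capture system).
true_params = (40.0, 2.0)
captured = simulate(*true_params)

def error(params):
    """Sum of squared differences between captured and synthetic marker
    positions -- a simple stand-in for an error function over static
    and dynamic cloth features."""
    sim = simulate(*params)
    return float(np.sum((captured - sim) ** 2))

# Derivative-free minimization, since the simulator is a black box.
result = minimize(error, x0=[10.0, 0.5], method="Nelder-Mead")
est_stiffness, est_damping = result.x
```

In the paper's setting, `simulate` would be the full cloth simulator, `captured` the tracked marker positions, and the error function would combine static terms (rest drape) with dynamic terms (trajectories over time), but the optimization loop has the same shape.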