Interactive virtual try-on based on real-time motion capture

  • Authors:
  • Xiaoyang Zhu; Shuxin Qin; Haitao Yu; Shuiying Ge; Yiping Yang; Yongshi Jiang

  • Affiliation (all authors):
  • Integrated Information System Research Center, Institute of Automation, Chinese Academy of Sciences, Beijing, China

  • Venue:
  • PCM'12: Proceedings of the 13th Pacific-Rim Conference on Advances in Multimedia Information Processing
  • Year:
  • 2012

Abstract

In this paper, we present an augmented reality system for interactive virtual try-on of garments based on real-time motion capture. The system uses a commodity depth sensor to obtain depth images and joint motion data of the user. We first apply a novel evaluation method to two specific depth images to determine the user's body measurements, according to which garment models are deformed to fit his/her actual size. The torso joint of the selected garment model is translated to follow the user's position, while the other joints are rotated to mimic the user's posture. The system then superimposes the transformed model onto the same user in a live image sequence so that it moves with him/her accurately. We also propose a method for handling model dangling artifacts caused by the limitations of current tracking techniques in producing smooth, noise-free motion data in real time. The system can be used in a user's home as well as in a retailer's shop to provide an interactive try-on experience.
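
The implementation details are not part of this listing, so the Python sketch below only illustrates the kind of per-frame update the abstract describes: the torso (root) joint of the garment rig is translated to the tracked user position, the remaining joints copy the tracked rotations, and the incoming joint data are low-pass filtered to suppress the tracking jitter behind the "dangling" artifacts. The class name GarmentRig, the joint list, and the exponential/nlerp smoothing are illustrative assumptions, not the authors' actual method, which the paper describes separately.

    import numpy as np

    # Hypothetical joint names; depth-sensor SDKs expose a similar skeleton,
    # but the exact joint set differs between trackers.
    JOINTS = ["torso", "neck", "left_shoulder", "left_elbow",
              "right_shoulder", "right_elbow", "left_hip", "right_hip"]

    def nlerp(q0, q1, t):
        """Normalized linear interpolation between two unit quaternions."""
        if np.dot(q0, q1) < 0.0:          # take the shorter arc
            q1 = -q1
        q = (1.0 - t) * q0 + t * q1
        return q / np.linalg.norm(q)

    class GarmentRig:
        """Drives a rigged garment model from tracked skeleton data.

        Following the abstract: the torso (root) joint is translated to the
        user's position, while the other joints only receive rotations.
        Joint data are smoothed to damp frame-to-frame sensor noise.
        """

        def __init__(self, smoothing=0.6):
            self.alpha = smoothing        # weight given to the previous frame
            self.position = None          # smoothed torso position, shape (3,)
            self.rotations = {}           # smoothed unit quaternions per joint

        def update(self, torso_position, joint_rotations):
            """torso_position: (x, y, z) in metres.
            joint_rotations: {joint_name: unit quaternion (w, x, y, z)}."""
            pos = np.asarray(torso_position, dtype=float)
            self.position = pos if self.position is None else \
                self.alpha * self.position + (1.0 - self.alpha) * pos

            for name in JOINTS:
                q = np.asarray(joint_rotations.get(name, (1.0, 0.0, 0.0, 0.0)), float)
                prev = self.rotations.get(name)
                self.rotations[name] = q if prev is None else nlerp(prev, q, 1.0 - self.alpha)

            # A renderer would apply this root translation and these joint
            # rotations to the garment skeleton before compositing the model
            # over the live camera image.
            return {"torso_translation": self.position,
                    "joint_rotations": dict(self.rotations)}

In a live loop, update() would be called once per tracker frame; the exponential smoothing trades a small amount of latency for stability, which is the usual compromise when the raw motion data are too noisy to drive a garment model directly.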