Video-based hand manipulation capture through composite motion control

  • Authors: Yangang Wang; Jianyuan Min; Jianjie Zhang; Yebin Liu; Feng Xu; Qionghai Dai; Jinxiang Chai
  • Affiliations: Tsinghua University; Texas A&M University; Texas A&M University; Tsinghua University; Tsinghua University; Tsinghua University; Texas A&M University
  • Venue: ACM Transactions on Graphics (TOG) - SIGGRAPH 2013 Conference Proceedings
  • Year: 2013

Abstract

This paper describes a new method for acquiring physically realistic hand manipulation data from multiple video streams. The key idea of our approach is to introduce a composite motion control that simultaneously models hand articulation, object movement, and the subtle interaction between the hand and the object. We formulate video-based hand manipulation capture as an optimization problem: we search for the motion control that, when used to drive a physics simulation, maximizes the consistency between the simulated motion and the observed image data. We demonstrate the effectiveness of our approach by capturing a wide range of high-fidelity dexterous manipulation data, and we show the power of the recovered motion controllers by adapting the captured motion data to new objects with different properties. The system outperforms alternative methods such as marker-based motion capture and kinematic hand motion tracking.
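The core computational idea above — searching control space so that a forward simulation best explains the video observations — can be sketched as a derivative-free optimization loop. The sketch below is a minimal illustration, not the paper's method: `simulate` stands in for the physics-based hand/object simulation, `image_inconsistency` for the image-consistency objective, and the simple randomized local search for whatever sampling-based optimizer is actually used; all function names and parameters here are hypothetical.

```python
import numpy as np

def image_inconsistency(simulated, observed):
    # Stand-in objective: sum of squared differences between
    # simulated features and observed image features.
    return float(np.sum((simulated - observed) ** 2))

def simulate(control, n_frames=10):
    # Hypothetical forward simulation: maps a control vector to
    # per-frame features (a stand-in for simulating hand/object motion).
    t = np.linspace(0.0, 1.0, n_frames)
    return np.outer(t, control)

def optimize_control(observed, dim, iters=200, sigma=0.5, seed=0):
    # Derivative-free local search: perturb the current best control
    # and keep the perturbation when it better explains the observations.
    rng = np.random.default_rng(seed)
    best = np.zeros(dim)
    best_cost = image_inconsistency(simulate(best), observed)
    for _ in range(iters):
        cand = best + sigma * rng.standard_normal(dim)
        cost = image_inconsistency(simulate(cand), observed)
        if cost < best_cost:
            best, best_cost = cand, cost
    return best, best_cost

# Synthetic "observations" generated from a known control vector,
# so the search has a recoverable ground truth.
true_control = np.array([1.0, -0.5, 0.25])
observed = simulate(true_control)
ctrl, cost = optimize_control(observed, dim=3)
```

Because the optimizer only ever treats the simulation as a black box, the same loop applies when the forward model is a full physics engine rather than this toy linear map.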