Performance capture with physical interaction

  • Authors (with affiliations):
  • Nam Nguyen (University of California Riverside); Nkenge Wheatland (University of California Riverside); David Brown (University of California Riverside); Brian Parise (Georgia Institute of Technology); C. Karen Liu (Georgia Institute of Technology); Victor Zordan (University of California Riverside)

  • Venue:
  • Proceedings of the 2010 ACM SIGGRAPH/Eurographics Symposium on Computer Animation
  • Year:
  • 2010


Abstract

This paper introduces a technique for combining performance-based animation with a physical model in order to synthesize complex interactions in an animated scene. The approach is to previsualize the interaction of the final integrated scene online, while the performance is being recorded. To accomplish this goal, we propose a framework that unifies kinematic playback of motion capture and dynamic motion synthesis. The proposed method augments a real-time recording of a human actor with dynamics-based response in order to modify the motion data based on the conditions of the character. The system unifies the kinematic and dynamic aspects of the final motion while allowing user control over the outcome, both temporally and spatially across the character's body. Examples of complex interactions interleaved with intelligent response underscore the power of the technique, along with multi-person captures in which remote users interact physically in a shared virtual world.
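The abstract describes blending kinematic playback with a dynamics-based response, weighted both temporally and spatially across the character's body. The following is a minimal sketch of that idea under assumed names and conventions (the function names, the linearized joint indexing, and the exponential spatial falloff are illustrative choices, not the paper's actual formulation):

```python
import numpy as np

def blend_pose(kinematic_pose, dynamic_pose, weights):
    """Per-joint blend between a motion-capture pose and a physically
    simulated pose. weights=0 plays back the capture unchanged;
    weights=1 fully defers to the dynamic response. Poses are arrays
    of joint angles (radians); weights may be a scalar or per-joint."""
    w = np.clip(np.asarray(weights, dtype=float), 0.0, 1.0)
    return (1.0 - w) * np.asarray(kinematic_pose, dtype=float) \
         + w * np.asarray(dynamic_pose, dtype=float)

def response_weights(n_joints, impact_joint, falloff=0.5):
    """Hypothetical spatial weighting: full dynamic response at the
    joint that received an impact, decaying with (linearized) distance
    along the joint index so distant joints keep the captured motion."""
    idx = np.arange(n_joints)
    return np.exp(-falloff * np.abs(idx - impact_joint))

# Example: an impact at joint 2 drives that joint fully dynamic while
# neighboring joints receive a partial response.
kin = np.array([0.1, 0.2, 0.3, 0.4, 0.5])
dyn = np.array([0.5, 0.5, 0.5, 0.5, 0.5])
blended = blend_pose(kin, dyn, response_weights(5, impact_joint=2))
```

In a live-capture loop, the temporal side of the control would ramp these weights up when a collision is detected and back down as the simulated response settles, returning control to the performer's recorded motion.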