Human video textures

  • Authors:
  • Matthew Flagg (Georgia Institute of Technology); Atsushi Nakazawa (Osaka University and Georgia Institute of Technology); Qiushuang Zhang (Google, Inc. and Georgia Institute of Technology); Sing Bing Kang (Microsoft Research); Young Kee Ryu (Sun Moon University); Irfan Essa (Georgia Institute of Technology); James M. Rehg (Georgia Institute of Technology)

  • Venue:
  • Proceedings of the 2009 symposium on Interactive 3D graphics and games
  • Year:
  • 2009


Abstract

This paper describes a data-driven approach for generating photorealistic animations of human motion. Each animation sequence follows a user-choreographed path and plays continuously by seamlessly transitioning between different segments of the captured data. To produce these animations, we capitalize on the complementary characteristics of motion capture data and video. We customize our capture system to record motion capture data that are synchronized with our video source. Candidate transition points in video clips are identified using a new similarity metric based on 3-D marker trajectories and their 2-D projections into video. Once the transitions have been identified, a video-based motion graph is constructed. We further exploit hybrid motion and video data to ensure that the transitions are seamless when generating animations. Motion capture marker projections serve as control points for segmentation of layers and nonrigid transformation of regions. This allows warping and blending to generate seamless in-between frames for animation. We show a series of choreographed animations of walks and martial arts scenes as validation of our approach.
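
As a rough illustration of the transition-finding step described in the abstract, the sketch below scores a candidate transition by comparing short windows of 3-D marker trajectories together with their 2-D projections into the video, and keeps low-cost frame pairs as candidate edges of a video-based motion graph. This is a minimal sketch under our own assumptions: all function names, the equal-weight sum of the two distance terms, and the window/threshold parameters are hypothetical, not the paper's actual formulation.

```python
import numpy as np

def transition_cost(mocap_a, mocap_b, proj_a, proj_b, w3d=1.0, w2d=1.0):
    # mocap_*: (W, M, 3) windows of 3-D marker positions (W frames, M markers)
    # proj_*:  (W, M, 2) corresponding 2-D marker projections in the video
    # Mean per-marker Euclidean distance in 3-D and in the image plane,
    # combined with assumed weights w3d and w2d.
    d3 = np.linalg.norm(mocap_a - mocap_b, axis=2).mean()
    d2 = np.linalg.norm(proj_a - proj_b, axis=2).mean()
    return w3d * d3 + w2d * d2

def find_transitions(mocap, proj, window=5, threshold=0.1, min_gap=10):
    # mocap: (N, M, 3) marker trajectories; proj: (N, M, 2) projections.
    # Exhaustively scan frame pairs; pairs whose windowed cost falls below
    # the threshold become candidate edges of the motion graph.
    n = len(mocap) - window
    edges = []
    for i in range(n):
        for j in range(n):
            if abs(i - j) < min_gap:  # skip trivially close frames
                continue
            c = transition_cost(mocap[i:i + window], mocap[j:j + window],
                                proj[i:i + window], proj[j:j + window])
            if c < threshold:
                edges.append((i, j, c))
    return edges
```

In practice the weights, window length, and cost threshold would be tuned to the capture setup, and the selected transitions would then be smoothed by the warping and blending step the abstract describes.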