Video Input Driven Animation (VIDA)

  • Authors: Meng Sun, Allan D. Jepson, Eugene Fiume


  • Venue: ICCV '03 Proceedings of the Ninth IEEE International Conference on Computer Vision - Volume 2
  • Year: 2003


Abstract

There are many challenges associated with the integration of synthetic and real imagery. One particularly difficult problem is the automatic extraction of salient parameters of natural phenomena in real video footage for subsequent application to synthetic objects. Can we ensure that the hair and clothing of a synthetic actor placed in a meadow of swaying grass will move consistently with the wind that moved that grass? The video footage can be seen as a controller for the motion of synthetic features, a concept we call video input driven animation (VIDA). We propose a schema that analyzes an input video sequence, extracts parameters from the motion of objects in the video, and uses this information to drive the motion of synthetic objects. To validate the principles of VIDA, we approximate the inverse problem to harmonic oscillation, which we use to extract parameters of wind and of regular water waves. We observe the effect of wind on a tree in a video, estimate wind speed parameters from its motion, and then use this to make synthetic objects move. We also extract water elevation parameters from the observed motion of boats and apply the resulting water waves to synthetic boats.
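The abstract describes inverting a harmonic-oscillator model to recover motion parameters (e.g. wind frequency and strength) from tracked oscillatory motion in video. As a loose illustration only, not the authors' actual estimator, the sketch below assumes a hypothetical 1-D displacement track of a swaying feature (one value per frame) and recovers its dominant oscillation frequency and amplitude from zero crossings and peak deflection:

```python
import math

def estimate_oscillation(displacement, fps):
    """Estimate frequency (Hz) and amplitude of a roughly sinusoidal
    1-D displacement signal sampled at `fps` frames per second.

    A crude stand-in for a harmonic-oscillator inverse problem:
    counts zero crossings (two per period) and takes the peak
    deflection as the amplitude.
    """
    mean = sum(displacement) / len(displacement)
    centered = [x - mean for x in displacement]  # remove resting offset
    # Count sign changes between consecutive samples.
    crossings = sum(
        1 for a, b in zip(centered, centered[1:])
        if (a < 0 <= b) or (b < 0 <= a)
    )
    duration = len(displacement) / fps
    frequency = crossings / (2.0 * duration)      # two crossings per cycle
    amplitude = max(abs(x) for x in centered)
    return frequency, amplitude

# Hypothetical tracked feature: a branch swaying at 0.5 Hz with
# amplitude 3.0 (arbitrary units), filmed at 30 fps for 10 seconds.
t = [i / 30.0 for i in range(300)]
signal = [3.0 * math.sin(math.pi * ti + 0.3) for ti in t]
freq, amp = estimate_oscillation(signal, fps=30)
```

In the VIDA setting, estimates like these would then parameterize a driving force (wind or water elevation) applied to the synthetic objects' own oscillator models; the paper's formulation is more principled than this zero-crossing heuristic.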