Motion Generator Approach to Translating Human Motion from Video to Animation

  • Authors:
  • Tsukasa Noma, Kyoji Oishi, Hiroshi Futsuhara, Hiromi Baba, Takeshi Ohashi, Toshiaki Ejima

  • Venue:
  • PG '99 Proceedings of the 7th Pacific Conference on Computer Graphics and Applications
  • Year:
  • 1999

Abstract

This paper proposes a motion generator approach to translating human motion from video image sequences to computer animation in real time. In this approach, a motion generator infers the current human motion and/or posture from data obtained by processing the source video images, and then generates and sends a set of joint angles to the target human body model. Compared with the existing motion capture approach, our approach is more robust and tolerates broader environmental and postural conditions. Experiments on a prototype system show that an animated virtual human can walk, sit, and lie down as the real human performer does, without special illumination control.
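The pipeline the abstract describes, inference of the current posture from processed video features followed by generation of joint angles for the target body model, can be sketched roughly as follows. This is a minimal illustrative sketch, not the paper's implementation: the `Observation` features, the threshold-based `infer_posture` rule, and the `POSTURE_ANGLES` table are all hypothetical stand-ins for the paper's inference and generation steps.

```python
from dataclasses import dataclass
from typing import Dict

# Hypothetical joint-angle sets (degrees) for a few coarse postures; a real
# motion generator would compute these from the inferred motion state.
POSTURE_ANGLES: Dict[str, Dict[str, float]] = {
    "walk": {"hip": 30.0, "knee": 45.0, "torso": 5.0},
    "sit":  {"hip": 90.0, "knee": 90.0, "torso": 0.0},
    "lie":  {"hip": 180.0, "knee": 175.0, "torso": 90.0},
}

@dataclass
class Observation:
    """Coarse features assumed to come from processing a video frame."""
    torso_tilt_deg: float  # torso angle relative to vertical
    hip_height: float      # normalized hip height above the ground plane

def infer_posture(obs: Observation) -> str:
    """Infer the current posture from simple thresholds (a stand-in for
    the motion generator's inference step)."""
    if obs.torso_tilt_deg > 60.0:
        return "lie"
    if obs.hip_height < 0.5:
        return "sit"
    return "walk"

def generate_joint_angles(obs: Observation) -> Dict[str, float]:
    """Infer the posture, then emit the joint angles to send to the
    target human body model."""
    return dict(POSTURE_ANGLES[infer_posture(obs)])
```

Running the generator per frame rather than tracking markers is what makes the approach tolerant of imperfect image data: even a coarse inference still yields a complete, plausible set of joint angles.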