Avatar motion control by user body postures

  • Authors:
  • Satoshi Yonemoto (Kyushu Sangyo University, Matsukadai, Fukuoka, Japan)
  • Hiroshi Nakano (Kyushu University, Kasuga-koen, Fukuoka, Japan)
  • Rin-ichiro Taniguchi (Kyushu University, Kasuga-koen, Fukuoka, Japan)

  • Venue:
  • MULTIMEDIA '03 Proceedings of the eleventh ACM international conference on Multimedia
  • Year:
  • 2003


Abstract

This paper describes avatar motion control driven by user body postures. Our goal is the seamless mapping of human motion in the real world into virtual environments, and we expect direct human motion sensing to become part of future interfaces. With the aim of making computing systems better suited to users, we have developed computer-vision-based avatar motion control in which human motion sensing is based on skin-color blob tracking. Our method generates realistic avatar motion from the sensing data. Our framework also uses virtual scene context as a priori knowledge: we assume that virtual objects can afford the avatar's actions, that is, the virtual environment provides action information for the avatar. Avatar motion is then controlled by simulating the idea of affordance, extended into the virtual environment.
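The abstract's sensing step, skin-color blob tracking, can be illustrated with a minimal sketch: threshold each frame against a skin-color range, then track the resulting blob by its centroid. The threshold values and the synthetic test frame below are illustrative assumptions, not the paper's actual parameters.

```python
import numpy as np

# Hypothetical HSV-like skin range; real thresholds would be tuned per camera and lighting.
SKIN_LO = np.array([0, 40, 60])
SKIN_HI = np.array([25, 180, 255])

def skin_mask(hsv_frame):
    """Binary mask of pixels whose channels all fall inside the assumed skin range."""
    return np.all((hsv_frame >= SKIN_LO) & (hsv_frame <= SKIN_HI), axis=-1)

def blob_centroid(mask):
    """Centroid (row, col) of the masked pixels, or None if no skin pixels were found."""
    ys, xs = np.nonzero(mask)
    if len(ys) == 0:
        return None
    return (float(ys.mean()), float(xs.mean()))

# Tiny synthetic frame: a 3x3 skin-colored patch near the top-left of a 10x10 image.
frame = np.zeros((10, 10, 3), dtype=np.int32)
frame[1:4, 1:4] = [10, 100, 150]   # inside the assumed skin range

print(blob_centroid(skin_mask(frame)))  # -> (2.0, 2.0)
```

Tracking the centroid frame to frame gives the 2-D trajectory of a body part (e.g. a hand or the face), which the system then maps onto the avatar's posture.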
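The affordance idea, that virtual objects themselves carry the action information the avatar can use, can be sketched as a lookup: each object advertises the actions it affords, and the avatar selects an action from the nearest object within reach. All names and the reach threshold below are hypothetical, chosen only to make the idea concrete.

```python
from dataclasses import dataclass, field

@dataclass
class VirtualObject:
    name: str
    position: tuple                  # (x, y, z) in the virtual scene
    affordances: list = field(default_factory=list)  # actions this object offers

def dist(a, b):
    """Euclidean distance between two points."""
    return sum((p - q) ** 2 for p, q in zip(a, b)) ** 0.5

def afforded_action(avatar_pos, objects, reach=1.5):
    """Return (object name, first afforded action) for the nearest in-reach object."""
    in_reach = [o for o in objects if dist(avatar_pos, o.position) <= reach]
    if not in_reach:
        return None
    nearest = min(in_reach, key=lambda o: dist(avatar_pos, o.position))
    return (nearest.name, nearest.affordances[0])

scene = [
    VirtualObject("chair", (1.0, 0.0, 0.0), ["sit"]),
    VirtualObject("door",  (5.0, 0.0, 0.0), ["open", "knock"]),
]
print(afforded_action((0.5, 0.0, 0.0), scene))  # -> ('chair', 'sit')
```

The design point is that action knowledge lives in the scene rather than in the motion-sensing pipeline: the tracker only has to deliver the avatar's pose, and the environment supplies the context that turns a posture into a meaningful action.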