Connecting users to virtual worlds within MPEG-V standardization

  • Authors:
  • Seungju Han, Jae-Joon Han, James D. K. Kim, Changyeong Kim

  • Affiliations:
  • Advanced Media Lab, Samsung Advanced Institute of Technology, Yongin, Republic of Korea (all authors)

  • Venue:
  • Image Communication
  • Year:
  • 2013


Abstract

Virtual worlds such as Second Life and 3D Internet/broadcasting services have become increasingly popular. A life-scale virtual-world presentation and intuitive interaction between users and virtual worlds would provide a more natural and immersive experience. Novel interaction technologies, such as facial-expression/body-motion tracking and remote interaction for virtual-object manipulation, can establish a strong connection between users in the real world and their avatars in the virtual world. For virtual worlds to be widely accepted and used, the various types of novel interaction devices need a unified format for the interaction between the real world and the virtual world. MPEG-V Media Context and Control (ISO/IEC 23005) standardizes such connecting information. This paper provides an overview of the real-world-to-virtual-world (R2V) part of MPEG-V, together with usage examples of its interfaces for controlling avatars and virtual objects through real-world devices. In particular, we investigate how the MPEG-V framework can be applied to facial animation and hand-based 3D manipulation using an intelligent camera. In addition, to enable intuitive manipulation of objects in a 3D virtual environment, we present two interaction techniques based on motion sensors: a two-handed spatial 3D interaction approach and a gesture-based interaction approach.
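The R2V flow described above, in which an adaptation engine maps sensed real-world data to virtual-world control commands, can be sketched as follows. This is a minimal illustrative sketch, not MPEG-V's actual API: the type names, the `scale` parameter, and the linear mapping are assumptions for illustration, since the standard specifies the interchange format for sensed information but leaves the adaptation strategy to the implementation.

```python
from dataclasses import dataclass

@dataclass
class SensedPosition:
    """Hypothetical reading from a real-world position sensor, in metres."""
    x: float
    y: float
    z: float

def to_virtual_translation(sensed, scale=10.0):
    """Map a sensed real-world position to a virtual-world translation.

    The uniform linear scaling here is an illustrative choice; an MPEG-V
    adaptation engine is free to apply any mapping from sensed data to
    scene commands (e.g. nonlinear gain, filtering, clamping).
    """
    return (sensed.x * scale, sensed.y * scale, sensed.z * scale)

# Example: a hand tracked 10 cm to the right and 20 cm up in the real
# world becomes a 1 x 2 scene-unit move of the manipulated virtual object.
print(to_virtual_translation(SensedPosition(0.1, 0.2, 0.0)))
```

In a full system, the sensed values would arrive in MPEG-V's XML-based sensed-information format and the resulting commands would drive avatar or virtual-object attributes in the scene.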