Tangible 3D: hand gesture interaction for immersive 3D modeling

  • Authors:
  • Hyosun Kim; Georgia Albuquerque; Sven Havemann; Dieter W. Fellner

  • Affiliations:
  • Hyosun Kim: Institut für ComputerGraphik, TU Braunschweig, Germany and Institut für ComputerGraphik und WissensVisualisierung, TU Graz, Austria
  • Georgia Albuquerque: Institut für ComputerGraphik, TU Braunschweig, Germany
  • Sven Havemann: Institut für ComputerGraphik, TU Braunschweig, Germany and Institut für ComputerGraphik und WissensVisualisierung, TU Graz, Austria
  • Dieter W. Fellner: Institut für ComputerGraphik, TU Braunschweig, Germany and Institut für ComputerGraphik und WissensVisualisierung, TU Graz, Austria

  • Venue:
  • EGVE'05 Proceedings of the 11th Eurographics conference on Virtual Environments
  • Year:
  • 2005

Abstract

Most interaction tasks relevant to a general three-dimensional virtual environment can be supported by 6-DOF control and grab/select input. A particularly efficient method is direct manipulation with the bare hands, as in a real environment. This paper shows that non-trivial tasks can be performed using only a few well-known hand gestures, so that almost no training is necessary to interact with the 3D software. Using this gesture interaction we have built an immersive 3D modeling system whose 3D model representation is based on a mesh library that is optimized not only for real-time rendering but also accommodates changes of both vertex positions and mesh connectivity in real time. For the gesture interaction, the user's hand is marked with just four fingertip thimbles made of inexpensive material, as simple as white paper. Within our scenario, the recognized hand gestures are used to select, create, manipulate and deform the meshes in a spontaneous and intuitive way. All modeling tasks are performed wirelessly, using camera/vision-based tracking for both head and hand interaction.
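
The abstract does not describe the mesh library's interface, so the C++ sketch below is only a hypothetical illustration (all type and function names are assumptions, not the authors' API) of a triangle-mesh representation that permits both kinds of real-time edit the abstract mentions: moving vertex positions during deformation, and changing connectivity, here via a simple 1-to-3 face split.

```cpp
#include <array>
#include <cstddef>
#include <vector>

// Minimal editable triangle mesh (hypothetical sketch): supports both
// per-frame vertex motion (gesture dragging) and connectivity changes
// (local refinement) without rebuilding the whole mesh.
struct Vec3 { float x, y, z; };

struct EditableMesh {
    std::vector<Vec3> positions;                  // vertex buffer
    std::vector<std::array<std::size_t, 3>> tris; // index buffer

    // Vertex deformation: only the position changes; connectivity
    // (and hence any GPU index buffer) stays valid.
    void moveVertex(std::size_t v, const Vec3& p) { positions[v] = p; }

    // Connectivity change: split triangle t into three triangles at
    // its centroid. The new vertex can serve as a handle for further
    // gesture-based deformation.
    std::size_t splitFace(std::size_t t) {
        const auto [a, b, c] = tris[t];
        const Vec3& pa = positions[a];
        const Vec3& pb = positions[b];
        const Vec3& pc = positions[c];
        const Vec3 centroid{ (pa.x + pb.x + pc.x) / 3.0f,
                             (pa.y + pb.y + pc.y) / 3.0f,
                             (pa.z + pb.z + pc.z) / 3.0f };
        const std::size_t m = positions.size();
        positions.push_back(centroid);
        tris[t] = {a, b, m};        // reuse the old triangle's slot
        tris.push_back({b, c, m});
        tris.push_back({c, a, m});
        return m;
    }
};
```

Because both operations edit the buffers in place, a renderer can upload the changed ranges incrementally each frame, which is one plausible way to reconcile real-time rendering with real-time topology edits.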