Most interaction tasks relevant to a general three-dimensional virtual environment can be supported by 6-DOF control and grab/select input. An obvious and efficient method is direct manipulation with the bare hands, as in the real environment. This paper shows that non-trivial tasks can be performed using only a few well-known hand gestures, so that almost no training is necessary to interact with the 3D software. Using this gesture interaction we have built an immersive 3D modeling system whose model representation is based on a mesh library that is optimized not only for real-time rendering but also accommodates real-time changes to both vertex positions and mesh connectivity. For gesture recognition, the user's hand is marked with just four fingertip thimbles made of material as inexpensive as plain white paper. Within our scenario, the recognized hand gestures are used to select, create, manipulate, and deform meshes in a spontaneous and intuitive way. All modeling tasks are performed wirelessly through camera/vision-based tracking of the head and hands.
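The mesh representation described above must support two kinds of real-time edit: geometric (moving vertex positions, e.g. during deformation) and topological (changing connectivity, e.g. during refinement). A minimal sketch of such a structure is given below; this is an illustrative assumption, not the paper's actual mesh library, and all names (`EditableMesh`, `move_vertex`, `split_face`) are hypothetical.

```python
# Illustrative sketch (not the authors' library): an editable triangle
# mesh supporting both real-time vertex moves and connectivity changes.

class EditableMesh:
    def __init__(self):
        self.vertices = []   # list of [x, y, z] positions
        self.faces = []      # list of (i, j, k) vertex-index triples

    def add_vertex(self, x, y, z):
        self.vertices.append([x, y, z])
        return len(self.vertices) - 1

    def add_face(self, i, j, k):
        self.faces.append((i, j, k))
        return len(self.faces) - 1

    def move_vertex(self, i, x, y, z):
        # Geometric edit: position changes, connectivity untouched.
        self.vertices[i] = [x, y, z]

    def split_face(self, f):
        # Topological edit: insert a vertex at the face centroid and
        # replace face f with three smaller triangles (1-to-3 split).
        i, j, k = self.faces[f]
        centroid = [(self.vertices[i][a] + self.vertices[j][a] +
                     self.vertices[k][a]) / 3.0 for a in range(3)]
        c = self.add_vertex(*centroid)
        self.faces[f] = (i, j, c)
        self.faces.append((j, k, c))
        self.faces.append((k, i, c))
        return c
```

A deforming gesture would call `move_vertex` on the grabbed vertices each frame, while a refinement gesture would call `split_face`; keeping both operations cheap is what allows the renderer to reflect edits immediately.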