In this paper, we introduce a method for tracking a user's hand in 3D and recognizing its gestures in real time, without any invasive devices attached to the hand. Our method uses multiple cameras to determine the position and orientation of a hand moving freely in 3D space. In addition, the method identifies predetermined gestures quickly and robustly using a neural network trained beforehand. The paper also reports the results of a user study of the proposed method and describes its use in several applications, including 3D object handling on a desktop system and a 3D walk-through on a large immersive display.
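The abstract does not give the network's architecture or features, so the following is only a minimal sketch of how a pre-trained feed-forward network might map a hand feature vector to a gesture class. All names, weights, and the single "fraction of extended fingers" feature are hypothetical illustrations, not the authors' actual design.

```python
import math

def _sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

# Hypothetical pre-trained weights for a 1-input, 2-hidden-unit,
# 2-class network (class 0 = "fist", class 1 = "open hand").
W_HIDDEN = [[5.0], [-5.0]]
B_HIDDEN = [-2.5, 2.5]
W_OUT = [[-1.0, 1.0], [1.0, -1.0]]
B_OUT = [0.0, 0.0]

def classify_gesture(features):
    """Forward pass: features -> hidden sigmoid layer -> class scores."""
    hidden = [
        _sigmoid(sum(w * x for w, x in zip(row, features)) + b)
        for row, b in zip(W_HIDDEN, B_HIDDEN)
    ]
    scores = [
        sum(w * h for w, h in zip(row, hidden)) + b
        for row, b in zip(W_OUT, B_OUT)
    ]
    # Return the index of the highest-scoring gesture class.
    return scores.index(max(scores))
```

With these toy weights, a feature of 0.0 (no fingers extended) is classified as a fist and 1.0 (all fingers extended) as an open hand; a real system would feed in a richer feature vector derived from the multi-camera hand pose.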