Mixed reality applications can provide users with enhanced interaction experiences by integrating virtual and real-world objects in a mixed environment. Through a mixed reality interface, a more realistic and immersive control style is achieved than with traditional keyboard and mouse input devices. The interface proposed in this paper uses a stereo camera to track the user's hands and fingers robustly and accurately in 3D space. To enable a physically realistic interaction experience, a physics engine simulates the physics of virtual object manipulation: objects can be picked up and tossed with real-world physical characteristics such as gravity and collisions. Detection and interaction in our system are fully computer-vision based, requiring no markers or additional sensors. We demonstrate this gesture-based interface with two mixed reality game implementations: finger fishing, in which a player fishes for virtual objects with his/her fingers as in a real environment, and Jenga, a simulation of the well-known tower-building game. A user study is conducted and reported to demonstrate the accuracy, effectiveness, and comfort of this interactive interface.
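Because fingertip detection is purely vision-based, the 3D fingertip position can be recovered from a rectified stereo pair by triangulating the matched fingertip pixels in the left and right images. The sketch below illustrates only the standard disparity-to-depth relation, not the paper's actual tracking pipeline; the function name, focal length, baseline, and principal point are hypothetical placeholders.

```python
def fingertip_3d(u_left, u_right, v,
                 f=700.0, baseline=0.12, cx=320.0, cy=240.0):
    """Triangulate a fingertip's 3D position (in the left-camera frame)
    from a rectified stereo pair.

    u_left, u_right: fingertip column in left/right image (pixels)
    v:               fingertip row (same in both images after rectification)
    f, baseline, cx, cy: illustrative camera intrinsics/extrinsics
    """
    disparity = u_left - u_right
    if disparity <= 0:
        raise ValueError("fingertip must have positive disparity")
    z = f * baseline / disparity   # depth from disparity (metres)
    x = (u_left - cx) * z / f      # back-project pixel to metric X
    y = (v - cy) * z / f           # back-project pixel to metric Y
    return (x, y, z)
```

With the placeholder intrinsics above, a fingertip seen at column 390 in the left image and 320 in the right (disparity of 70 pixels) triangulates to a depth of 1.2 m; once the 3D position is known, it can drive a physics-engine proxy object for pick-up and tossing interactions.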