We present a robust detector for deictic gestures based on a time-of-flight (TOF) camera, a combined range and intensity image sensor. Pointing direction is used both to determine whether a gesture is intended for the system at all and to assign different meanings to the same gesture depending on where the user points. We use the gestures to control a slideshow presentation: making a “thumbs-up” gesture while pointing to the left or right of the screen switches to the previous or next slide, and pointing at the screen causes a “virtual laser pointer” to appear. Since the pointing direction is estimated in 3D, the user can move freely within the field of view of the camera once the system has been calibrated. The pointing direction is measured with an absolute accuracy of 0.6 degrees and a measurement noise of 0.9 degrees near the center of the screen.
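As a rough illustration of the geometry behind the “virtual laser pointer”, the sketch below intersects a pointing ray with the screen plane. It is not the authors' implementation: the choice of ray origin and direction (e.g. head-to-hand from the TOF camera's 3D measurements), the plane parameters, and all function names are hypothetical assumptions.

```python
# Hedged sketch: where a 3D pointing ray hits a calibrated screen plane.
# origin/direction would come from tracked body points (assumption: a
# head-to-hand ray); plane_point/plane_normal from screen calibration.

def sub(a, b):   return tuple(x - y for x, y in zip(a, b))
def add(a, b):   return tuple(x + y for x, y in zip(a, b))
def scale(v, s): return tuple(x * s for x in v)
def dot(a, b):   return sum(x * y for x, y in zip(a, b))

def ray_plane_intersection(origin, direction, plane_point, plane_normal):
    """Return the 3D point where origin + t*direction meets the plane,
    or None if the ray is parallel to or points away from the screen."""
    denom = dot(direction, plane_normal)
    if abs(denom) < 1e-9:
        return None          # ray parallel to the screen plane
    t = dot(sub(plane_point, origin), plane_normal) / denom
    if t < 0:
        return None          # screen lies behind the user
    return add(origin, scale(direction, t))

# Example (hypothetical coordinates in meters, screen in the z = 0 plane):
head = (0.2, 1.5, 2.0)
hand = (0.2, 1.4, 1.5)
hit = ray_plane_intersection(head, sub(hand, head), (0, 0, 0), (0, 0, 1))
```

Once the hit point is known, mapping it to pixel coordinates on the slide is a 2D affine transform fixed during the same calibration step.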