This paper presents a computer-vision-based approach for creating 3D tangible interfaces that supports real-time, flexible interaction with an augmented virtual world. The approach uses real-world objects and free-hand gestures as interaction handles: the identity of these objects and gestures, as well as their 3D pose in the physical world, is tracked in real time. Once an object or gesture is recognized and localized, the corresponding virtual object can be manipulated dynamically by the operator handling the real object. Because the tracking algorithm is robust to background clutter and adapts to illumination changes, it performs well in real-world scenarios where both objects and cameras move rapidly in unconstrained environments.
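The pipeline the abstract describes — recover a real object's identity and 3D pose from the camera, then apply that pose to its virtual counterpart — can be sketched as follows. This is a minimal illustration, not the paper's implementation: the vision tracker is replaced by a hypothetical `track_object` stub, since the specific detection method is not detailed here.

```python
import numpy as np

def yaw_rotation(theta):
    """Rotation matrix about the vertical (z) axis by theta radians."""
    c, s = np.cos(theta), np.sin(theta)
    return np.array([[c, -s, 0.0],
                     [s,  c, 0.0],
                     [0.0, 0.0, 1.0]])

def track_object():
    # Hypothetical stand-in for the vision tracker: in the real system this
    # would return the object's identity and 6-DoF pose estimated from the
    # camera image. Here: a "cube" rotated 90 degrees and shifted along x.
    return "cube", yaw_rotation(np.pi / 2), np.array([1.0, 0.0, 0.0])

def update_virtual(vertices, R, t):
    # Apply the tracked rigid-body pose (rotation R, translation t)
    # to the virtual object's vertices.
    return vertices @ R.T + t

# Virtual object as two vertices; drive it from the tracked pose.
verts = np.array([[0.0, 0.0, 0.0],
                  [1.0, 0.0, 0.0]])
obj_id, R, t = track_object()
moved = update_virtual(verts, R, t)
```

In a live system this update would run once per camera frame, so the virtual object follows the physical handle continuously.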