Illuminating clay: a 3-D tangible interface for landscape analysis
Proceedings of the SIGCHI Conference on Human Factors in Computing Systems
TouchLight: an imaging touch screen and display for gesture-based interaction
Proceedings of the 6th international conference on Multimodal interfaces
A Time-Of-Flight Depth Sensor - System Description, Issues and Solutions
CVPRW '04 Proceedings of the 2004 Conference on Computer Vision and Pattern Recognition Workshop (CVPRW'04), Volume 3
PlayAnywhere: a compact interactive tabletop projection-vision system
Proceedings of the 18th annual ACM symposium on User interface software and technology
Robust computer vision-based detection of pinching for one and two-handed gesture input
UIST '06 Proceedings of the 19th annual ACM symposium on User interface software and technology
Evaluation of gesture based interfaces for medical volume visualization tasks
Proceedings of the 10th International Conference on Virtual Reality Continuum and Its Applications in Industry
A Scale Independent Selection Process for 3D Object Recognition in Cluttered Scenes
International Journal of Computer Vision
Recently developed time-of-flight (TOF) depth-sensing video cameras provide precise per-pixel range data in addition to color video. Such cameras will find application in robotics and in vision-based human-computer interaction scenarios such as games and gesture-input systems. TOF range cameras are becoming increasingly available, and they promise to simplify the 3D reconstruction of scenes by avoiding the practical issues of 3D imaging techniques based on triangulation or disparity estimation. We present a 3D interactive interface that uses a depth-sensing camera to let users touch spatial objects, give details of its implementation, and speculate on how this technology will enable new 3D interactions.

This study implements a virtual touch screen that tracks the location of the hand in the depth image output by a TOF camera using a Kalman filter. Because the depth image is insensitive to lighting conditions, the resulting virtual touch screen operates independently of its surroundings. The biggest problem with conventional virtual touch screens has been that even a slight change in location caused an object to fall out of, or enter, the virtual touch screen: the pointing location responded too sensitively for the touch point to be detected accurately. The Kalman filter addresses this problem by continuously predicting the pointing location and detecting the touch point without interruption, even under slight changes in location. This yields a stable and smooth trajectory for the pointing point on the virtual touch screen.
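The Kalman-filter-based pointer smoothing described above can be sketched as follows. This is a minimal illustration, not the paper's implementation: it assumes a constant-velocity motion model for the hand, a 2-D measurement (the hand position extracted from the TOF depth image), and hypothetical noise parameters (`process_var`, `meas_var`) that would have to be tuned to the actual camera.

```python
import numpy as np

class PointerKalmanFilter:
    """Constant-velocity Kalman filter smoothing a 2-D pointing location,
    e.g. a hand position extracted per frame from a TOF depth image.
    State vector: [x, y, vx, vy]; measurement: [x, y]."""

    def __init__(self, dt=1 / 30, process_var=1e-2, meas_var=4.0):
        # State-transition model: position advances by velocity * dt.
        self.F = np.array([[1, 0, dt, 0],
                           [0, 1, 0, dt],
                           [0, 0, 1,  0],
                           [0, 0, 0,  1]], dtype=float)
        # Measurement model: we observe position only.
        self.H = np.array([[1, 0, 0, 0],
                           [0, 1, 0, 0]], dtype=float)
        self.Q = process_var * np.eye(4)   # process noise (assumed)
        self.R = meas_var * np.eye(2)      # measurement noise (assumed)
        self.x = np.zeros(4)               # initial state estimate
        self.P = 100.0 * np.eye(4)         # large initial uncertainty

    def update(self, z):
        """Predict one frame ahead, then correct with measurement z = (x, y).
        Returns the smoothed (x, y) pointing location."""
        # Predict: propagate state and covariance through the motion model.
        self.x = self.F @ self.x
        self.P = self.F @ self.P @ self.F.T + self.Q
        # Correct: blend in the noisy measurement via the Kalman gain.
        z = np.asarray(z, dtype=float)
        S = self.H @ self.P @ self.H.T + self.R
        K = self.P @ self.H.T @ np.linalg.inv(S)
        self.x = self.x + K @ (z - self.H @ self.x)
        self.P = (np.eye(4) - K @ self.H) @ self.P
        return self.x[:2]
```

Even when successive measurements jitter by a few pixels, the filtered output changes gradually, which is what prevents the pointer from abruptly crossing in and out of the virtual touch-screen plane.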