3D spatial touch system based on time-of-flight camera

  • Authors:
  • Yang-Keun Ahn; Young-Choong Park; Kwang-Soon Choi; Woo-Chool Park; Hae-Moon Seo; Kwang-Mo Jung

  • Affiliations:
  • Korea Electronics Technology Institute, Mapo-gu, Seoul, Korea (all authors)

  • Venue:
  • WSEAS Transactions on Information Science and Applications
  • Year:
  • 2009

Abstract

Recently developed depth-sensing video cameras based on the time-of-flight principle provide precise per-pixel range data in addition to color video. Such cameras will find application in robotics and in vision-based human-computer interaction scenarios such as games and gesture input systems. Time-of-flight range cameras are becoming increasingly available and promise to simplify the 3D reconstruction of scenes, avoiding the practical issues that arise with 3D imaging techniques based on triangulation or disparity estimation. We present a 3D interactive interface system that uses a depth-sensing camera to let users touch spatial objects, detail its implementation, and speculate on how this technology will enable new 3D interactions. This study implements a virtual touch screen that tracks the hand location extracted from the depth image produced by a time-of-flight (TOF) camera, using a Kalman filter. Because the depth image is insensitive to ambient light, the virtual touch screen operates independently of its surroundings. The biggest problem with conventional virtual touch screens has been that even a slight change in position caused an object to fall out of, or enter, the virtual touch screen; in other words, the pointing location responded too sensitively for the touch point to be detected accurately. The Kalman filter addresses this problem by continuously predicting the pointing location and detecting the touch point without interruption, even under small changes in position. This enables a stable and smooth change in the location of the pointing point on the virtual touch screen.
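
The abstract does not specify the authors' filter design. As a rough illustration of the smoothing step it describes, the sketch below shows a constant-velocity Kalman filter that predicts and corrects a noisy 3D hand position from a depth camera, one measurement per frame. The state layout, noise covariances, 30 fps time step, and the class name HandKalmanFilter are all illustrative assumptions, not details taken from the paper.

```python
# Minimal sketch (not the authors' implementation): a constant-velocity Kalman
# filter that smooths a noisy 3D hand position, e.g. to stabilize the pointing
# location on a virtual touch screen driven by a TOF depth camera.
import numpy as np

class HandKalmanFilter:
    def __init__(self, dt=1.0 / 30.0, process_var=1e-2, measurement_var=1e-1):
        # State: [x, y, z, vx, vy, vz]; measurement: [x, y, z] from the depth image.
        self.x = np.zeros(6)
        self.P = np.eye(6)
        self.F = np.eye(6)
        self.F[:3, 3:] = dt * np.eye(3)            # constant-velocity motion model
        self.H = np.hstack([np.eye(3), np.zeros((3, 3))])
        self.Q = process_var * np.eye(6)           # process noise (assumed value)
        self.R = measurement_var * np.eye(3)       # measurement noise (assumed value)

    def step(self, z):
        # Predict the next pointing location, so small frame-to-frame jumps do not
        # make the tracked hand "leave" or "enter" the virtual touch screen.
        self.x = self.F @ self.x
        self.P = self.F @ self.P @ self.F.T + self.Q
        # Correct the prediction with the hand position measured from the depth image.
        y = np.asarray(z, dtype=float) - self.H @ self.x
        S = self.H @ self.P @ self.H.T + self.R
        K = self.P @ self.H.T @ np.linalg.inv(S)
        self.x = self.x + K @ y
        self.P = (np.eye(6) - K @ self.H) @ self.P
        return self.x[:3]                          # smoothed (x, y, z) touch point

# Usage: feed one noisy hand position per camera frame.
kf = HandKalmanFilter()
for z in [(0.10, 0.20, 0.95), (0.11, 0.19, 0.97), (0.12, 0.21, 0.94)]:
    print(kf.step(z))
```

Because the filter keeps predicting between measurements, the reported pointing location changes gradually even when the raw per-frame hand position jitters, which is the stabilizing behavior the abstract attributes to the Kalman filter.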