Detecting interaction above digital tabletops using a single depth camera

  • Authors:
  • Nadia Haubner; Ulrich Schwanecke; Ralf Dörner; Simon Lehmann; Johannes Luderschmidt

  • Affiliations:
  • RheinMain University of Applied Sciences, Wiesbaden, Germany 65195 (all authors)

  • Venue:
  • Machine Vision and Applications
  • Year:
  • 2013

Abstract

Digital tabletop environments offer great potential for application scenarios in which multiple users interact simultaneously or work on collaborative tasks. So far, research in this field has focused on touch and tangible interaction, which takes place only on the tabletop's surface. Initial approaches aim to involve the space above the surface, e.g., by employing freehand gestures. However, these are either limited to specific scenarios or rely on obtrusive tracking solutions. In this paper, we propose an approach to unobtrusively segment and detect interaction above a digital surface using a depth-sensing camera. To achieve this, we adapt a previously presented approach that segments arms in depth data from a front-view to a top-view setup, facilitating the detection of hand positions. Moreover, we propose a novel algorithm to merge segments and compare it with the original segmentation algorithm. Since the algorithm involves a large number of parameters, the optimal configuration must be estimated. To accomplish this, we describe a low-effort approach to estimate the parameter configuration based on simulated annealing. An evaluation of our system for detecting hands shows that a repositioning precision of approximately 1 cm is achieved. This accuracy is sufficient to reliably realize interaction metaphors above a surface.
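
The abstract names simulated annealing as the method for estimating the segmentation parameter configuration but gives no implementation details. The following is a minimal, hypothetical Python sketch of such a parameter search; the parameter names (depth_threshold_mm, min_blob_area_px, merge_distance_px), their bounds, and the cost function are illustrative assumptions, not the authors' actual configuration.

```python
import math
import random

# Hypothetical parameter ranges for the arm/hand segmentation pipeline.
# These names and bounds are illustrative assumptions, not the paper's.
PARAM_BOUNDS = {
    "depth_threshold_mm": (5.0, 100.0),    # cutoff above the tabletop surface
    "min_blob_area_px":   (50.0, 2000.0),  # minimum size of an arm segment
    "merge_distance_px":  (1.0, 60.0),     # distance below which segments are merged
}

def random_neighbor(params, scale=0.1):
    """Perturb one randomly chosen parameter, clamped to its bounds."""
    key = random.choice(list(PARAM_BOUNDS))
    lo, hi = PARAM_BOUNDS[key]
    neighbor = dict(params)
    neighbor[key] = min(hi, max(lo, params[key] + random.gauss(0.0, scale * (hi - lo))))
    return neighbor

def simulated_annealing(cost_fn, iterations=2000, t_start=1.0, t_end=0.01):
    """Minimize cost_fn(params) over the parameter space, occasionally accepting
    worse configurations with a probability that decays as the temperature cools."""
    current = {k: random.uniform(*b) for k, b in PARAM_BOUNDS.items()}
    current_cost = cost_fn(current)
    best, best_cost = current, current_cost
    for i in range(iterations):
        # Exponential cooling schedule from t_start down to t_end.
        t = t_start * (t_end / t_start) ** (i / max(1, iterations - 1))
        candidate = random_neighbor(current)
        cost = cost_fn(candidate)
        if cost < current_cost or random.random() < math.exp((current_cost - cost) / t):
            current, current_cost = candidate, cost
            if cost < best_cost:
                best, best_cost = candidate, cost
    return best, best_cost

if __name__ == "__main__":
    # Stand-in cost for demonstration only: in the paper's setting, cost_fn would
    # run the segmentation with the given parameters on labeled depth frames and
    # return the mean hand-position error against ground truth.
    def dummy_cost(params):
        return abs(params["depth_threshold_mm"] - 30.0) + abs(params["merge_distance_px"] - 20.0)

    best_params, best_cost = simulated_annealing(dummy_cost)
    print(best_params, best_cost)
```

In practice the cost function would wrap the full segmentation and hand-detection pipeline, so each annealing step is expensive; keeping the iteration count and the set of labeled frames small is what makes this a low-effort tuning procedure.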