Free-hand interaction for handheld augmented reality using an RGB-depth camera

  • Authors:
  • Huidong Bai; Lei Gao; Jihad El-Sana; Mark Billinghurst

  • Affiliations:
  • University of Canterbury, New Zealand; University of Canterbury, New Zealand; University of Canterbury, New Zealand and Ben-Gurion University of the Negev, Israel; University of Canterbury, New Zealand

  • Venue:
  • SIGGRAPH Asia 2013 Symposium on Mobile Graphics and Interactive Applications
  • Year:
  • 2013

Abstract

In this paper, we present a novel gesture-based interaction method for handheld Augmented Reality (AR), implemented on a tablet with an RGB-Depth camera attached. Compared with conventional device-centric input methods such as the keypad, stylus, or touchscreen, natural gesture-based interfaces offer a more intuitive experience for AR applications. Combined with depth information, gesture interfaces can extend handheld AR interaction into full 3D space. In our system, we retrieve the 3D hand skeleton from the color and depth frames and map the result to corresponding manipulations of virtual objects in the AR scene. Our method allows users to control virtual objects in 3D space with their bare hands, performing operations such as translation, rotation, and zooming.
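
The paper does not publish source code; as an illustration of the kind of mapping the abstract describes, the sketch below (Python/NumPy, all names hypothetical) converts frame-to-frame changes of a tracked hand skeleton, given as a palm centre plus thumb and index fingertips in camera coordinates, into translation, rotation, and zoom of a virtual object. The hand tracking and gesture segmentation described in the paper are not reproduced here.

```python
import numpy as np

class VirtualObject:
    """Pose of an AR virtual object: position, yaw rotation, and uniform scale."""
    def __init__(self):
        self.position = np.zeros(3)   # metres, in the AR scene frame
        self.yaw = 0.0                # radians, rotation about the vertical axis
        self.scale = 1.0              # uniform zoom factor

def apply_hand_motion(obj, prev_hand, curr_hand):
    """Map frame-to-frame hand skeleton changes to object manipulation.

    `prev_hand` / `curr_hand` are dicts with (3,) arrays in camera coordinates:
      'palm'  : palm centre (from the depth map)
      'thumb' : thumb-tip position
      'index' : index-tip position
    """
    # Translation: follow the 3D displacement of the palm centre.
    obj.position += curr_hand['palm'] - prev_hand['palm']

    # Rotation: change in the horizontal angle of the thumb->index vector.
    def horiz_angle(hand):
        v = hand['index'] - hand['thumb']
        return np.arctan2(v[2], v[0])
    obj.yaw += horiz_angle(curr_hand) - horiz_angle(prev_hand)

    # Zoom: ratio of current to previous thumb-index spread.
    prev_spread = np.linalg.norm(prev_hand['thumb'] - prev_hand['index'])
    curr_spread = np.linalg.norm(curr_hand['thumb'] - curr_hand['index'])
    if prev_spread > 1e-6:
        obj.scale *= curr_spread / prev_spread
```

A real system would gate each of these updates on a recognized gesture state (for example, translating only while the hand is closed), but that logic depends on the specific hand tracker and is omitted from this sketch.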