Physically interactive tabletop augmented reality using the Kinect

  • Authors:
  • Sam Corbett-Davies, Richard Green, Adrian Clark

  • Affiliations:
  • University of Canterbury, New Zealand; University of Canterbury, New Zealand; HIT Lab NZ, New Zealand

  • Venue:
  • Proceedings of the 27th Conference on Image and Vision Computing New Zealand
  • Year:
  • 2012

Abstract

In this paper we present a method for allowing arbitrary objects to interact physically in an augmented reality (AR) environment. A Microsoft Kinect is used to track objects in six degrees of freedom, enabling realistic interaction between them and virtual content in a tabletop AR context. We propose a point-cloud-based method for achieving such interaction. An adaptive per-pixel depth threshold is used to extract foreground objects, which are grouped using connected-component analysis. Objects are tracked with a variant of the Iterative Closest Point algorithm that uses randomised projective correspondences. Our algorithm tracks objects moving at typical tabletop speeds with median drifts of 8.5% (rotational) and 4.8% (translational). The point cloud representation of each foreground object is refined as additional views of the object become visible to the Kinect. Physics-based AR interaction is achieved by fitting a collection of spheres to the point cloud model and passing them to the Bullet physics engine as a physics proxy of the object. Our method is demonstrated in an AR application where the user can interact with a virtual tennis ball, illustrating our proposed method's potential for physics-based AR interaction.
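The foreground-extraction step described in the abstract (an adaptive per-pixel depth threshold followed by connected-component grouping) could be sketched roughly as below. This is a minimal illustration and not the authors' implementation: the median/standard-deviation background model, the noise-scaled threshold `k * noise`, and the `min_pixels` filter are assumptions introduced here for clarity.

```python
# Sketch: adaptive per-pixel depth thresholding + connected-component grouping.
# All names and parameters are illustrative assumptions, not the paper's code.

import numpy as np
from scipy import ndimage


def build_background_model(depth_frames):
    """Estimate a per-pixel background depth (mm) from frames of the empty
    tabletop, plus a per-pixel noise estimate used to adapt the threshold."""
    stack = np.stack(depth_frames).astype(np.float32)
    background = np.median(stack, axis=0)
    noise = np.std(stack, axis=0)
    return background, noise


def extract_foreground_objects(depth, background, noise, k=3.0, min_pixels=200):
    """Return boolean masks for connected foreground regions in a depth frame.

    A pixel is foreground if it lies closer to the camera than the background
    by more than k times the local noise estimate (the per-pixel adaptive
    threshold). Connected-component analysis then groups foreground pixels
    into candidate objects; tiny components are discarded as sensor noise.
    """
    valid = depth > 0  # the Kinect reports 0 where depth is unknown
    foreground = valid & (background - depth > k * noise)

    labels, n = ndimage.label(foreground)
    objects = []
    for i in range(1, n + 1):
        mask = labels == i
        if mask.sum() >= min_pixels:
            objects.append(mask)
    return objects
```

The resulting per-object masks would then feed the later stages the abstract describes: point cloud construction, ICP-based tracking, and sphere fitting for the Bullet physics proxy.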