MirageTable: freehand interaction on a projected augmented reality tabletop

  • Authors:
  • Hrvoje Benko (Microsoft Research, Redmond, Washington, United States); Ricardo Jota (Technical University of Lisbon, Lisbon, Portugal & Microsoft Research, Redmond, Washington, United States); Andrew Wilson (Microsoft Research, Redmond, Washington, United States)

  • Venue:
  • Proceedings of the SIGCHI Conference on Human Factors in Computing Systems
  • Year:
  • 2012

Abstract

Instrumented with a single depth camera, a stereoscopic projector, and a curved screen, MirageTable is an interactive system designed to merge real and virtual worlds into a single spatially registered experience on top of a table. Our depth camera tracks the user's eyes and performs a real-time capture of both the shape and the appearance of any object placed in front of the camera (including the user's body and hands). This real-time capture enables perspective-correct stereoscopic 3D visualizations for a single user that account for deformations caused by physical objects on the table. In addition, the user can interact with virtual objects through physically realistic freehand actions without any gloves, trackers, or instruments. We illustrate these unique capabilities through three application examples: virtual 3D model creation, interactive gaming with real and virtual objects, and a 3D teleconferencing experience that not only presents a 3D view of a remote person, but also a seamless 3D shared task space. We also evaluated users' perception of projected 3D objects in our system, which confirmed that users can correctly perceive such objects even when they are projected over different background colors and geometries (e.g., gaps, drops).
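The abstract's core rendering idea, showing imagery that accounts for the user's tracked eye position and the captured surface geometry, amounts to view-dependent projection mapping: colors are chosen per physical surface point so that they look correct from the eye's viewpoint. The sketch below is a minimal illustration of that general technique, not the paper's actual implementation; `surface_points`, `eye_view_proj`, and `eye_image` are hypothetical inputs standing in for the depth-camera capture, the eye-tracked camera matrix, and a virtual-scene render from the eye.

```python
import numpy as np

def perspective_correct_colors(surface_points, eye_view_proj, eye_image):
    """Assign each captured surface point the color the user's eye
    should see along that line of sight.

    surface_points: (N, 3) world-space points from the depth camera.
    eye_view_proj:  (4, 4) view-projection matrix of the tracked eye.
    eye_image:      (H, W, 3) virtual scene rendered from the eye's viewpoint.
    Returns an (N, 3) array of colors; the projector paints each surface
    point with its color so the result appears undistorted to the user,
    even on non-planar geometry such as a curved screen or objects on the table.
    """
    h, w, _ = eye_image.shape
    # Homogeneous world coordinates.
    pts_h = np.concatenate([surface_points,
                            np.ones((len(surface_points), 1))], axis=1)
    clip = pts_h @ eye_view_proj.T              # project into the eye's view
    ndc = clip[:, :2] / clip[:, 3:4]            # perspective divide -> [-1, 1]
    # Map normalized device coordinates to pixel indices in the eye render.
    px = np.clip(((ndc[:, 0] + 1) * 0.5 * (w - 1)).astype(int), 0, w - 1)
    py = np.clip(((1 - (ndc[:, 1] + 1) * 0.5) * (h - 1)).astype(int), 0, h - 1)
    return eye_image[py, px]
```

For stereoscopic output, the same lookup would be performed twice per frame, once per eye position, and the projector would alternate the resulting images in sync with shutter glasses.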