VRCodes: Unobtrusive and active visual codes for interaction by exploiting rolling shutter

  • Authors:
  • Grace Woo; Andy Lippman; Ramesh Raskar

  • Affiliations:
  • MIT Media Lab, USA; MIT Media Lab, USA; MIT Media Lab, USA

  • Venue:
  • ISMAR '12: Proceedings of the 2012 IEEE International Symposium on Mixed and Augmented Reality (ISMAR)
  • Year:
  • 2012


Abstract

We present a new visible tagging solution for active displays that allows a rolling-shutter camera to detect active tags robustly from a relatively large distance. Current planar markers are visually obtrusive to the human viewer: to be readable from afar and to embed more information, they must be displayed larger, occupying valuable physical space in the design. We present a new active visual tag that utilizes all dimensions of color, time, and space while remaining unobtrusive to the human eye and decodable by a 15 fps rolling-shutter camera. The design exploits the flicker-fusion frequency threshold of the human visual system, which, owing to the effect of metamerism, cannot resolve metamer pairs alternating beyond 120 Hz. Concurrently, the code remains decodable by a 15 fps rolling-shutter camera because of its effective line-scan speed of 15×400 lines per second. We show that an off-the-shelf rolling-shutter camera can resolve the metamers flickering on a television from a distance of over 4 meters. We use intelligent binary coding to encode digital positioning and show potential applications such as large-screen interaction. We analyze the use of codes for locking onto and tracking encoded targets. We also analyze the constraints and performance of the sampling system, and discuss several plausible application scenarios.
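
As a rough back-of-envelope illustration of why the rolling shutter suffices, the sketch below works through the sampling arithmetic implied by the figures quoted above (15 fps, roughly 400 scan lines per frame, 120 Hz metamer flicker). It is only a minimal sanity check of the numbers, not the paper's actual decoding pipeline.

```python
# Back-of-envelope check: can a 15 fps rolling-shutter camera sample a
# 120 Hz metamer flicker? The per-frame line count (400) is as quoted in
# the abstract; the rest is standard sampling arithmetic.

FRAME_RATE_FPS = 15      # camera frame rate
LINES_PER_FRAME = 400    # effective scan lines read out per frame
FLICKER_HZ = 120         # metamer alternation rate on the display

# Each scan line is exposed at a slightly different time, so the rolling
# shutter acts like a line-rate temporal sampler across the image.
line_scan_rate = FRAME_RATE_FPS * LINES_PER_FRAME   # 6000 lines/s

# Nyquist-style check: the line-scan rate must exceed twice the flicker
# frequency for the alternating metamers to show up as resolvable stripes.
nyquist_rate = 2 * FLICKER_HZ                        # 240 Hz

print(f"Effective line-scan rate: {line_scan_rate} lines/s")
print(f"Required (Nyquist) rate:  {nyquist_rate} Hz")
print("Flicker resolvable by rolling shutter:", line_scan_rate > nyquist_rate)

# Spatial period of the resulting stripes in a captured frame: how many
# scan lines one full cycle of the 120 Hz alternation spans.
lines_per_flicker_cycle = line_scan_rate / FLICKER_HZ  # 50 lines per cycle
print(f"Stripe period: {lines_per_flicker_cycle:.0f} scan lines per cycle")
```

Under these assumed numbers, the 6000 lines/s effective sampling rate comfortably exceeds the 240 Hz Nyquist requirement, and each flicker cycle spans about 50 scan lines in the image, which is why the flicker that is invisible to the eye appears as decodable stripes to the camera.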