Looking at you: fused gyro and face tracking for viewing large imagery on mobile devices

  • Authors:
  • Neel Joshi (Microsoft Research, Redmond, Washington, United States)
  • Abhishek Kar (IIT Kanpur, Kanpur, Uttar Pradesh, India & Microsoft Research, Redmond, Washington, United States)
  • Michael Cohen (Microsoft Research, Redmond, Washington, United States)

  • Venue:
  • Proceedings of the SIGCHI Conference on Human Factors in Computing Systems
  • Year:
  • 2012

Abstract

We present a touch-free interface for viewing large imagery on mobile devices. In particular, we focus on viewing paradigms for 360-degree panoramas, parallax image sequences, and long multi-perspective panoramas. We describe a sensor fusion methodology that combines face tracking using a front-facing camera with gyroscope data to produce a robust signal that defines the viewer's 3D position relative to the display. The gyroscopic data provides low-latency feedback and allows extrapolation of the face position beyond the field of view of the front-facing camera. We also demonstrate a hybrid position and rate control that uses the viewer's 3D position to drive exploration of very large image spaces. We report on the efficacy of the hybrid control vs. position-only control through a user study.
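The fusion scheme sketched in the abstract (fast but drifting gyro rates corrected by slower, drift-free face-tracker measurements) is commonly realized as a complementary filter. The sketch below is a minimal illustration of that general idea, not the authors' implementation; the class name, blend weight, and one-axis simplification are assumptions for clarity.

```python
class GyroFaceFusion:
    """Minimal one-axis complementary-filter sketch (hypothetical names).

    Gyro integration gives low-latency angle updates and keeps tracking
    when the face leaves the camera's field of view; face measurements,
    when available, pull the estimate back to correct gyro drift.
    """

    def __init__(self, blend=0.05):
        self.blend = blend  # weight given to each camera measurement (assumed)
        self.angle = 0.0    # estimated viewer angle relative to the display

    def on_gyro(self, rate, dt):
        # Low-latency path: integrate angular rate over the timestep.
        # This also extrapolates the pose beyond the camera's view.
        self.angle += rate * dt
        return self.angle

    def on_face(self, measured_angle):
        # Drift-free path: blend toward the face-derived angle whenever
        # the front-facing camera reports a detection.
        self.angle += self.blend * (measured_angle - self.angle)
        return self.angle
```

In practice the gyro callback runs at a high rate (e.g. 100+ Hz) while face detections arrive more slowly and intermittently, which is why the two update paths are kept separate.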