Augmenting looking, pointing and reaching gestures to enhance the searching and browsing of physical objects

  • Authors:
  • David Merrill; Pattie Maes

  • Affiliations:
  • MIT Media Lab, Cambridge, MA; MIT Media Lab, Cambridge, MA

  • Venue:
  • PERVASIVE '07: Proceedings of the 5th International Conference on Pervasive Computing
  • Year:
  • 2007

Abstract

In this paper we present a framework for attaching information to physical objects so that it can be interactively browsed and searched in a hands-free, multi-modal, and personalized manner, leveraging users' natural looking, pointing, and reaching behaviors. The system uses small infrared transponders, placed on objects in the environment and worn by the user, to achieve the dense, on-object visual feedback usually possible only in augmented reality systems, while improving on the interaction style and wearable-gear requirements of those systems. We discuss two implemented applications: a tutorial about the parts of an automobile engine and a personalized supermarket assistant. The paper continues with a user study investigating browsing and searching behaviors in the supermarket scenario, and concludes with a discussion of findings and future work.
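The abstract does not specify the transponder protocol, so the following is a minimal sketch only, assuming a simple broadcast-query scheme: a worn transceiver polls nearby on-object tags, and a tag whose keywords match the query lights its LED to provide on-object visual feedback. All class names, IDs, and keywords here are hypothetical illustrations, not the authors' implementation.

```python
# Hypothetical sketch of a broadcast-query scheme between a worn IR
# transceiver and on-object transponders. Names and message format are
# assumptions for illustration, not the paper's actual protocol.

from dataclasses import dataclass


@dataclass
class Transponder:
    """An IR tag attached to a physical object."""
    tag_id: str
    keywords: set[str]
    led_on: bool = False

    def on_query(self, query: str) -> None:
        # Light the on-object LED when the broadcast query matches
        # one of this object's keywords (on-object visual feedback).
        self.led_on = query.lower() in self.keywords


class WornTransceiver:
    """Worn IR unit that broadcasts queries to tags in range."""

    def __init__(self, tags_in_range: list[Transponder]) -> None:
        # A plain list stands in for IR line-of-sight discovery.
        self.tags_in_range = tags_in_range

    def search(self, query: str) -> list[Transponder]:
        # Broadcast the query; matching tags turn their LEDs on.
        for tag in self.tags_in_range:
            tag.on_query(query)
        return [t for t in self.tags_in_range if t.led_on]


# Usage: a supermarket-assistant style search (items are made up).
shelf = [
    Transponder("cereal-01", {"gluten-free", "breakfast"}),
    Transponder("pasta-02", {"wheat", "dinner"}),
]
wearable = WornTransceiver(shelf)
matches = wearable.search("gluten-free")
print([t.tag_id for t in matches])  # -> ['cereal-01']
```

In this sketch the set of in-range tags is given directly; in the system the abstract describes, which objects respond would presumably be determined by the IR line of sight of the worn transceiver as the user looks, points, or reaches.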