ShoeSense: a new perspective on gestural interaction and wearable applications

  • Authors: Gilles Bailly; Jörg Müller; Michael Rohs; Daniel Wigdor; Sven Kratz

  • Affiliations: Quality and Usability Lab, Telekom Innovation Laboratories, TU Berlin, Berlin, Germany; Quality and Usability Lab, Telekom Innovation Laboratories, TU Berlin, Berlin, Germany; University of Munich, Munich, Germany; University of Toronto, Toronto, Ontario, Canada; University of Munich, Munich, Germany

  • Venue: Proceedings of the SIGCHI Conference on Human Factors in Computing Systems

  • Year: 2012

Abstract

When the user is engaged with a real-world task it can be inappropriate or difficult to use a smartphone. To address this concern, we developed ShoeSense, a wearable system consisting in part of a shoe-mounted depth sensor pointing upward at the wearer. ShoeSense recognizes relaxed and discreet as well as large and demonstrative hand gestures. In particular, we designed three gesture sets (Triangle, Radial, and Finger-Count) for this setup, which can be performed without visual attention. The advantages of ShoeSense are illustrated in five scenarios: (1) quickly performing frequent operations without reaching for the phone, (2) discreetly performing operations without disturbing others, (3) enhancing operations on mobile devices, (4) supporting accessibility, and (5) artistic performances. We present a proof-of-concept, wearable implementation based on a depth camera and report on a lab study comparing social acceptability, physical and mental demand, and user preference. A second study demonstrates a 94-99% recognition rate of our recognizers.
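The paper does not publish its recognizer code, and the sketch below is not the authors' method. As a rough, hypothetical illustration of the kind of single-frame processing a shoe-mounted depth sensor enables, it counts extended fingers (the basis of a Finger-Count-style gesture) from one depth image using standard OpenCV contour and convexity-defect analysis. The function name, depth range in millimeters, area threshold, and defect-depth cutoff are all assumptions chosen for illustration, and OpenCV 4 return conventions are assumed.

```python
import numpy as np
import cv2


def count_fingers(depth_frame, near_mm=300, far_mm=900):
    """Rough finger count from one depth frame (hypothetical thresholds).

    Assumes the hand is the closest large object above the sensor,
    lying between `near_mm` and `far_mm` in a uint16 depth image.
    """
    # Segment the hand by depth range and clean up the mask.
    mask = cv2.inRange(depth_frame, near_mm, far_mm)
    mask = cv2.morphologyEx(mask, cv2.MORPH_OPEN, np.ones((5, 5), np.uint8))

    # Largest connected contour is taken to be the hand (OpenCV 4 API).
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)
    if not contours:
        return 0
    hand = max(contours, key=cv2.contourArea)
    if cv2.contourArea(hand) < 2000:   # too small to be a hand (assumed cutoff)
        return 0

    # Deep convexity defects correspond to gaps between extended fingers.
    hull = cv2.convexHull(hand, returnPoints=False)
    defects = cv2.convexityDefects(hand, hull)
    if defects is None:
        return 1                        # no gaps: fist or single finger

    gaps = 0
    for start, end, far, depth in defects[:, 0]:
        if depth / 256.0 > 30:          # defect deeper than ~30 px (assumed)
            gaps += 1
    return min(gaps + 1, 5)
```

A real system along the lines described in the abstract would of course add temporal filtering across frames and a classifier for the Triangle and Radial sets; this fragment only shows why an upward-looking depth view makes such gestures cheap to segment.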