User experiences with activity-based navigation on mobile devices

  • Authors:
  • A.J. Bernheim Brush;Amy K. Karlson;James Scott;Raman Sarin;Andy Jacobs;Barry Bond;Oscar Murillo;Galen Hunt;Mike Sinclair;Kerry Hammil;Steven Levi

  • Affiliations:
  • Microsoft Research, Redmond, WA, USA;Microsoft Research, Redmond, WA, USA;Microsoft Research, Redmond, WA, USA;Microsoft Research, Redmond, WA, USA;Microsoft Research, Redmond, WA, USA;Microsoft Research, Redmond, WA, USA;Microsoft, Redmond, WA, USA;Microsoft Research, Redmond, WA, USA;Microsoft Research, Redmond, WA, USA;Microsoft Research, Redmond, WA, USA;Microsoft Research, Redmond, WA, USA

  • Venue:
  • Proceedings of the 12th International Conference on Human-Computer Interaction with Mobile Devices and Services
  • Year:
  • 2010

Abstract

We introduce activity-based navigation, which uses human activities derived from sensor data to help people navigate, in particular to retrace a "trail" previously taken by that person or another person. Such trails may include step counts, walking up/down stairs or taking elevators, compass directions, and photos taken along a user's path, in addition to absolute positioning (GPS and maps) when available. To explore the user experience of activity-based navigation, we built Greenfield, a mobile-device interface for finding a car. We conducted a ten-participant user study comparing users' ability to find cars across three different presentations of activity-based information as well as verbal instructions. Our results show that activity-based navigation can be used for car finding and suggest its broader promise for supporting navigation tasks. We present lessons for future activity-based navigation interfaces and motivate further work in this space, particularly in the area of robust activity inference.