User-defined gestures for connecting mobile phones, public displays, and tabletops

  • Authors:
  • Christian Kray (Newcastle University, Newcastle upon Tyne, United Kingdom)
  • Daniel Nesbitt (Newcastle University, Newcastle upon Tyne, United Kingdom)
  • John Dawson (Newcastle University, Newcastle upon Tyne, United Kingdom)
  • Michael Rohs (TU Berlin, Berlin, Germany)

  • Venue:
  • Proceedings of the 12th International Conference on Human-Computer Interaction with Mobile Devices and Services (MobileHCI)
  • Year:
  • 2010


Abstract

Gestures can offer an intuitive way to interact with a computer. In this paper, we investigate whether gesturing with a mobile phone can help users perform complex tasks involving two devices. We present results from a user study in which we asked participants to spontaneously produce gestures with their phone to trigger a set of different activities. We investigated three conditions (device configurations): phone-to-phone, phone-to-tabletop, and phone-to-public-display. We report on the kinds of gestures we observed as well as on feedback from the participants, and provide an initial assessment of which sensors might facilitate gesture recognition on a phone. The results suggest that phone gestures have the potential to be easily understood by end users and that certain device configurations and activities may be well suited for gesture control.