Integrating intra and extra gestures into a mobile and multimodal shopping assistant

  • Authors:
  • Rainer Wasinger, Antonio Krüger, Oliver Jacobs

  • Affiliations:
  • DFKI GmbH, Intelligent User Interfaces Department, Saarbrücken, Germany; Institute for Geoinformatics, University of Münster, Münster, Germany; DFKI GmbH, Intelligent User Interfaces Department, Saarbrücken, Germany

  • Venue:
  • PERVASIVE '05: Proceedings of the Third International Conference on Pervasive Computing
  • Year:
  • 2005

Abstract

With the rise of mobile and pervasive computing, applications increasingly need to adapt to their surrounding environments and to present information to users in an easy and natural manner. In this paper we describe a user interface that integrates multimodal input on a handheld device with external gestures performed on real-world artifacts. The described approach extends reference resolution based on speech, handwriting, and gesture to real-world objects that users may hold in their hands. We discuss the varied interaction channels available to users that arise from mixing and matching input modalities on the mobile device with actions performed in the environment. We also discuss the underlying components required to handle these extended multimodal interactions and present an implementation of our ideas in a demonstrator called the Mobile ShopAssist. This demonstrator then serves as the basis for a recent usability study, also described here, on user interaction within mobile contexts.