TouchPosing: multi-modal interaction with geospatial data

  • Authors:
  • Florian Daiber, Sven Gehring, Markus Löchtefeld, Antonio Krüger

  • Affiliation:
  • German Research Center for Artificial Intelligence (DFKI), Saarbrücken, Germany (all authors)

  • Venue:
  • Proceedings of the 11th International Conference on Mobile and Ubiquitous Multimedia
  • Year:
  • 2012


Abstract

Multi-touch interaction offers new opportunities for interacting with complex data. In particular, the exploration of geographical data, which to date has mostly relied on mouse and keyboard input, could benefit from this interaction paradigm. However, the gestures required to interact with complex systems such as Geographic Information Systems (GIS) grow more difficult with every added function. This paper describes a novel interaction approach that allows non-expert users to easily explore geographic data using a combination of multi-touch gestures and hand postures. The additional input modality, hand pose, is intended to remove the need for more complex multi-touch gestures. Furthermore, the screen of a wearable device serves as an additional output modality that avoids occlusion and, at the same time, acts as a magic lens.
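
To make the multimodal idea concrete, the sketch below (not taken from the paper) shows one plausible way such an input layer could be organized: the recognized hand pose acts as a mode switch, so each touch gesture can stay simple. All pose names, gesture names, and the MapView/on_input helpers are illustrative assumptions, not the authors' implementation.

```python
# Minimal sketch (assumptions, not the paper's code): dispatch GIS actions
# from a (hand pose, touch gesture) pair. The pose selects the mode, so the
# touch gesture itself can remain a simple drag, pinch, or tap.

from dataclasses import dataclass
from typing import Callable, Dict, Tuple


@dataclass
class MapView:
    """Hypothetical map state manipulated by the interaction layer."""
    zoom: float = 1.0
    center: Tuple[float, float] = (0.0, 0.0)

    def pan(self, dx: float, dy: float) -> None:
        cx, cy = self.center
        self.center = (cx + dx, cy + dy)

    def zoom_by(self, factor: float) -> None:
        self.zoom *= factor


Handler = Callable[[MapView, dict], None]

# Illustrative mapping: the same simple gesture means different things
# under different hand poses, instead of requiring a more complex
# multi-touch gesture for each function.
DISPATCH: Dict[Tuple[str, str], Handler] = {
    ("flat_hand", "drag"): lambda m, e: m.pan(e["dx"], e["dy"]),
    ("flat_hand", "pinch"): lambda m, e: m.zoom_by(e["scale"]),
    ("pointing", "tap"): lambda m, e: print(f"query feature at {e['pos']}"),
}


def on_input(view: MapView, hand_pose: str, gesture: str, event: dict) -> None:
    """Route a touch event according to the currently recognized hand pose."""
    handler = DISPATCH.get((hand_pose, gesture))
    if handler is not None:
        handler(view, event)


if __name__ == "__main__":
    view = MapView()
    on_input(view, "flat_hand", "drag", {"dx": 10.0, "dy": -5.0})
    on_input(view, "pointing", "tap", {"pos": (120, 80)})
    print(view)
```

In this reading, the wearable screen described in the abstract would act purely as a second output channel (e.g. rendering the magic-lens view of the region under the hand), so it does not appear in the input dispatch above.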