Bimanual Interaction with Interscopic Multi-Touch Surfaces

  • Authors:
  • Johannes Schöning;Frank Steinicke;Antonio Krüger;Klaus Hinrichs;Dimitar Valkov

  • Affiliations:
Intelligent User Interfaces Department, DFKI (German Research Center for Artificial Intelligence), Saarbrücken, Germany 66123;Visualization and Computer Graphics Group, Department of Computer Science, University of Münster, Münster, Germany 48149;Intelligent User Interfaces Department, DFKI (German Research Center for Artificial Intelligence), Saarbrücken, Germany 66123;Visualization and Computer Graphics Group, Department of Computer Science, University of Münster, Münster, Germany 48149;Visualization and Computer Graphics Group, Department of Computer Science, University of Münster, Münster, Germany 48149

  • Venue:
  • INTERACT '09 Proceedings of the 12th IFIP TC 13 International Conference on Human-Computer Interaction: Part II
  • Year:
  • 2009


Abstract

Multi-touch interaction has received considerable attention in the last few years, in particular for natural two-dimensional (2D) interaction. However, many application areas deal with three-dimensional (3D) data and therefore require intuitive 3D interaction techniques. Virtual reality (VR) systems provide sophisticated 3D user interfaces but lack efficient 2D interaction, and are therefore rarely adopted by ordinary users or even by experts. Since multi-touch interfaces represent a good trade-off between intuitive, constrained interaction on a touch surface providing tangible feedback, and unrestricted natural interaction without any instrumentation, they have the potential to form the foundation of the next-generation user interface for 2D as well as 3D interaction. In particular, stereoscopic display of 3D data provides an additional depth cue, but until now the challenges and limitations of multi-touch interaction in this context have not been considered. In this paper we present new multi-touch paradigms and interactions that combine traditional 2D interaction and novel 3D interaction on a touch surface to form a new class of multi-touch systems, which we refer to as interscopic multi-touch surfaces (iMUTS). We discuss iMUTS-based user interfaces that support interaction with 2D content displayed in monoscopic mode and with 3D content usually displayed stereoscopically. To underline the potential of the proposed iMUTS setup, we have developed and evaluated two example interaction metaphors for different domains: first, intuitive navigation techniques for virtual 3D city models, and second, a natural metaphor for deforming volumetric datasets in a medical context.