Using 3D Touch Interaction for a Multimodal Zoomable User Interface

  • Authors:
  • Florian Laquai; Markus Ablassmeier; Tony Poitschke; Gerhard Rigoll

  • Affiliations:
  • Institute for Human-Machine Communication, Technische Universität München, Munich, Germany 80333 (all authors)

  • Venue:
  • Proceedings of the Symposium on Human Interface 2009 / Universal Access in Human-Computer Interaction, Part I: Held as Part of HCI International 2009
  • Year:
  • 2009


Abstract

Touchscreens are becoming the preferred input device in a growing number of applications and are increasingly being introduced into the automotive domain. Current implementations impose problems such as the need for precise pointing and high visual attention; we therefore investigate the capabilities of projected capacitive touchscreens. Unlike traditional sensing techniques such as resistive or optical measurement, projected capacitive sensors register the presence of a hand or finger even before the actual touch. This enables such devices to be used not only for 2D input but also for interaction based on the position and distance of the user's finger relative to the screen. The additional distance information is then applied to control a Zoomable User Interface. In this study, touch and speech interaction are combined to allow fuzzy input and a lower visual attention demand than conventional touchscreens. A demonstrator using the described input strategies was developed for the automotive domain and evaluated in a user study.
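The core idea of the abstract, using the sensed finger-to-screen distance to drive the zoom level of a Zoomable User Interface, can be sketched as follows. This is a minimal illustration, not the paper's implementation: the hover range, zoom bounds, linear mapping, and smoothing constant are all assumptions.

```python
def distance_to_zoom(distance_mm, d_min=5.0, d_max=50.0,
                     zoom_min=1.0, zoom_max=4.0):
    """Map finger-to-screen hover distance to a zoom factor.

    A close finger yields maximum magnification (detail view); a finger at
    the edge of the sensing range yields the overview. All ranges here are
    illustrative assumptions, not values from the paper.
    """
    # Clamp to the sensor's usable hover range.
    d = max(d_min, min(d_max, distance_mm))
    # Normalize: 0.0 at closest approach, 1.0 at the far edge of the range.
    t = (d - d_min) / (d_max - d_min)
    # Linear interpolation from zoom_max (near) down to zoom_min (far).
    return zoom_max - t * (zoom_max - zoom_min)


class SmoothedZoom:
    """Exponential smoothing to suppress frame-to-frame sensor jitter."""

    def __init__(self, alpha=0.3):
        self.alpha = alpha  # smoothing factor (assumed), 0 < alpha <= 1
        self.level = None

    def update(self, distance_mm):
        """Feed one distance sample; return the smoothed zoom level."""
        target = distance_to_zoom(distance_mm)
        if self.level is None:
            self.level = target  # initialize on first sample
        else:
            self.level += self.alpha * (target - self.level)
        return self.level
```

A per-frame loop would read the capacitive sensor, call `update()`, and re-render the ZUI at the returned level; the smoothing step matters in practice because raw hover estimates are noisy.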