Designing gaze-supported multimodal interactions for the exploration of large image collections

  • Authors:
  • Sophie Stellmach; Sebastian Stober; Andreas Nürnberger; Raimund Dachselt

  • Affiliations:
  • University of Magdeburg, Germany (all authors)

  • Venue:
  • Proceedings of the 1st Conference on Novel Gaze-Controlled Applications
  • Year:
  • 2011


Abstract

While eye tracking is becoming increasingly relevant as a promising input channel, applications that use gaze control in a natural way remain rather limited. Although several researchers have indicated the particularly high potential of gaze-based interaction for pointing tasks, gaze-only approaches are often investigated, and time-consuming dwell-time activations limit their potential. To overcome this, we present a gaze-supported fisheye lens in combination with (1) a keyboard and (2) a tilt-sensitive mobile multi-touch device. In a user-centered design approach, we elicited how users would use these input combinations. Based on the feedback received, we designed a prototype system for interacting with a remote display using gaze and a touch-and-tilt device. This eliminates gaze dwell-time activations and the well-known Midas Touch problem (unintentionally issuing an action via gaze). A formative user study testing our prototype provided further insights into how well the elaborated gaze-supported interaction techniques were experienced by users.
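The core idea of avoiding dwell-time activation can be illustrated with a minimal sketch (not the authors' implementation; all class and method names here are hypothetical): gaze only steers the pointer, and an action is issued solely by an explicit touch event on the handheld device, so looking alone can never trigger anything.

```python
from dataclasses import dataclass


@dataclass
class GazeSample:
    """One screen-coordinate sample from a hypothetical eye tracker."""
    x: float
    y: float


class GazeTouchSelector:
    """Combine gaze pointing with an explicit touch confirmation.

    Gaze moves the cursor; only a touch tap issues a selection. There is
    no dwell timer, which sidesteps the Midas Touch problem: fixating an
    item never activates it by itself.
    """

    def __init__(self) -> None:
        self.cursor = (0.0, 0.0)

    def on_gaze(self, sample: GazeSample) -> None:
        # Gaze updates the (coarse) pointer position only.
        self.cursor = (sample.x, sample.y)

    def on_touch_tap(self):
        # The explicit touch event confirms the selection at the
        # current gaze position.
        return ("select", self.cursor)


sel = GazeTouchSelector()
sel.on_gaze(GazeSample(120.0, 80.0))  # user looks at a thumbnail
# Looking alone causes no selection (no dwell timer is running).
action = sel.on_touch_tap()           # tap on the mobile device confirms
```

A real system would additionally smooth noisy gaze samples and map the tilt sensor to zoom or lens parameters, but the separation of pointing (gaze) from triggering (touch) is the essential pattern.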