Using eye-tracking to support interaction with layered 3D interfaces on stereoscopic displays

  • Authors: Florian Alt, Stefan Schneegass, Jonas Auda, Rufat Rzayev, Nora Broy
  • Affiliations: University of Munich, München, Germany; University of Stuttgart, Stuttgart, Germany; University of Stuttgart, Stuttgart, Germany; University of Stuttgart, Stuttgart, Germany; BMW Research and Technology, Munich, Germany
  • Venue: Proceedings of the 19th International Conference on Intelligent User Interfaces
  • Year: 2014

Abstract

In this paper, we investigate the concept of gaze-based interaction with 3D user interfaces. Stereoscopic displays are becoming ubiquitous, particularly as auto-stereoscopy enables the perception of 3D content without glasses. As a result, application areas for 3D beyond entertainment in cinema or at home are emerging, including work settings, mobile phones, public displays, and cars. At the same time, eye tracking is reaching the consumer market with low-cost devices. We envision that eye trackers will be integrated into consumer devices (laptops, mobile phones, displays), allowing the user's gaze to be analyzed and used as input for interactive applications. A particular challenge when applying this concept to 3D displays is that current eye trackers provide the gaze point in 2D only (x and y coordinates). In this paper, we compare the performance of two methods that use the physiology of the eyes to calculate the gaze point in 3D space, thus enabling gaze-based interaction with stereoscopic content. Furthermore, we compare gaze interaction in 2D and 3D with regard to user experience and performance. Our results show that, with current technology, eye tracking on stereoscopic displays is possible with performance similar to that on standard 2D screens.
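
The abstract does not spell out the two physiology-based methods the paper compares. Purely as a hypothetical illustration of the general idea, the sketch below estimates a 3D gaze point from the vergence of the two eyes by finding the closest approach of the left- and right-eye gaze rays. It assumes that the eye positions and the per-eye 2D gaze points (lifted onto the display plane) are available in one metric coordinate system; all function and variable names are placeholders and not taken from the paper.

```python
import numpy as np

def gaze_point_3d(eye_l, eye_r, screen_l, screen_r):
    """Estimate a 3D gaze point from binocular gaze data (hypothetical sketch).

    eye_l, eye_r:       3D positions of the left/right eye (e.g. from head tracking).
    screen_l, screen_r: 3D positions of the 2D gaze points the tracker reports
                        for each eye, lifted onto the display plane.
    All inputs are assumed to share one metric coordinate system.
    """
    eye_l, eye_r = np.asarray(eye_l, float), np.asarray(eye_r, float)
    screen_l, screen_r = np.asarray(screen_l, float), np.asarray(screen_r, float)

    # Gaze rays: from each eye through its on-screen gaze point.
    d_l = screen_l - eye_l
    d_r = screen_r - eye_r
    d_l /= np.linalg.norm(d_l)
    d_r /= np.linalg.norm(d_r)

    # Closest points between the two rays (standard least-squares solution).
    w = eye_l - eye_r
    a, b, c = d_l @ d_l, d_l @ d_r, d_r @ d_r
    d, e = d_l @ w, d_r @ w
    denom = a * c - b * b
    if abs(denom) < 1e-9:
        return None  # rays are (nearly) parallel: no usable vergence cue
    t = (b * e - c * d) / denom
    s = (a * e - b * d) / denom

    p_l = eye_l + t * d_l   # closest point on the left-eye ray
    p_r = eye_r + s * d_r   # closest point on the right-eye ray
    return (p_l + p_r) / 2  # midpoint as the estimated 3D gaze point


# Example: eyes 6 cm apart, gaze points converged on a screen plane at z = 0.6 m,
# so the estimated gaze point lies behind the display (positive parallax).
print(gaze_point_3d([-0.03, 0.0, 0.0], [0.03, 0.0, 0.0],
                    [0.01, 0.0, 0.6], [0.02, 0.0, 0.6]))
```

Returning the midpoint of the closest approach (rather than a true intersection) is a common choice because noisy gaze rays almost never intersect exactly; how the actual methods in the paper handle this is not stated in the abstract.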