Gaze input for mobile devices by dwell and gestures

  • Authors:
  • Morten Lund Dybdal; Javier San Agustin; John Paulin Hansen

  • Affiliations:
  • IT University of Copenhagen, Rued Langgaardsvej, Copenhagen S, Denmark; IT University of Copenhagen, Rued Langgaardsvej, Copenhagen S, Denmark; IT University of Copenhagen, Rued Langgaardsvej, Copenhagen S, Denmark

  • Venue:
  • Proceedings of the Symposium on Eye Tracking Research and Applications
  • Year:
  • 2012


Abstract

This paper investigates whether it is feasible to interact with the small screen of a smartphone using eye movements only. Two of the most common gaze-based selection strategies, dwell-time selection and gaze gestures, are compared in a target selection experiment. Finger strokes and accelerometer-based interaction, i.e. tilting, are also considered. In an experiment with 11 subjects, gaze interaction performed worse than touch interaction but was comparable to accelerometer-based (i.e. tilt) interaction in error rate and completion time. Gaze gestures had a lower error rate and were faster than dwell-based gaze selections, especially for small targets, suggesting that gestures may be the best option for hands-free gaze control of smartphones.
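Dwell-time selection, one of the two strategies compared in the abstract, triggers a target once the gaze has rested on it continuously for a fixed duration. The sketch below is a minimal illustration of that idea in Python; the class name, the 500 ms threshold, and the per-frame update interface are illustrative assumptions, not details taken from the paper.

```python
import time


class DwellSelector:
    """Selects a target once gaze has stayed on it for a dwell threshold."""

    def __init__(self, dwell_time=0.5):
        self.dwell_time = dwell_time   # seconds of continuous fixation required (assumed value)
        self.current_target = None     # target currently under the gaze point
        self.enter_time = None         # time at which gaze entered that target

    def update(self, target, now=None):
        """Feed the target id under the current gaze sample (or None).

        Returns the target id when the dwell threshold is reached,
        otherwise returns None.
        """
        now = time.monotonic() if now is None else now

        if target != self.current_target:
            # Gaze moved to a different target (or off all targets): restart the timer.
            self.current_target = target
            self.enter_time = now
            return None

        if target is not None and now - self.enter_time >= self.dwell_time:
            # Dwell completed; reset so the same target is not selected repeatedly.
            self.current_target = None
            self.enter_time = None
            return target

        return None


# Usage: call update() with the target hit by each new gaze sample.
selector = DwellSelector(dwell_time=0.5)
selector.update("button_A", now=0.0)          # gaze enters the target
selected = selector.update("button_A", now=0.6)  # 0.6 s later -> "button_A" is selected
print(selected)
```

Gaze gestures, by contrast, map a short sequence of directional eye strokes to a command and need no dwell timer, which is one reason they can be faster and less error-prone on small targets.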