Speech-augmented eye gaze interaction with small closely spaced targets

  • Authors:
  • Darius Miniotas, Oleg Špakov, Ivan Tugoy, I. Scott MacKenzie

  • Affiliations:
  • Šiauliai University, Šiauliai, Lithuania;University of Tampere, Tampere, Finland;University of Tampere, Tampere, Finland;York University, Toronto, Canada

  • Venue:
  • Proceedings of the 2006 symposium on Eye tracking research & applications
  • Year:
  • 2006


Abstract

Eye trackers have been used as pointing devices for a number of years. Due to inherent limitations in the accuracy of eye gaze, however, interaction is limited to objects spanning at least one degree of visual angle. Consequently, targets in gaze-based interfaces have sizes and layouts quite distant from "natural settings". To accommodate accuracy constraints, we developed a multimodal pointing technique combining eye gaze and speech inputs. The technique was tested in a user study on pointing at multiple targets. Results suggest that in terms of a footprint-accuracy tradeoff, pointing performance is best (~93%) for targets subtending 0.85 degrees with 0.3-degree gaps between them. User performance is thus shown to approach the limit of practical pointing. Effectively, developing a user interface that supports hands-free interaction and has a design similar to today's common interfaces is feasible.
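The target sizes above are given in degrees of visual angle, so their on-screen extent depends on viewing distance via the standard relation s = 2·d·tan(θ/2). As a quick illustration (not from the paper), the sketch below converts the reported 0.85-degree targets and 0.3-degree gaps into millimetres; the 600 mm viewing distance is an assumption, not a figure stated in this abstract.

```python
import math

def visual_angle_to_size(theta_deg: float, distance_mm: float) -> float:
    """On-screen extent (mm) of an object subtending theta_deg
    of visual angle at the given viewing distance."""
    return 2.0 * distance_mm * math.tan(math.radians(theta_deg) / 2.0)

# Assumed viewing distance of 600 mm (hypothetical; not given in the abstract).
target_mm = visual_angle_to_size(0.85, 600)  # width of one target
gap_mm = visual_angle_to_size(0.30, 600)     # gap between adjacent targets
print(f"target: {target_mm:.1f} mm, gap: {gap_mm:.1f} mm")
```

At this assumed distance the targets come out to roughly 9 mm wide with about 3 mm gaps, which makes the "small closely spaced targets" claim concrete.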