Comparing interfaces based on what users watch and do

  • Authors:
  • Eric C. Crowe; N. Hari Narayanan

  • Affiliations:
  • Computer Information Systems Department, DeVry Institute of Technology; Intelligent & Interactive Systems Laboratory, Computer Science & Software Engineering Dept., Auburn University, 107 Dunstan Hall, Auburn, AL

  • Venue:
  • ETRA '00: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications
  • Year:
  • 2000

Abstract

With the development of novel interfaces controlled through multiple modalities, new approaches are needed to analyze the process of interaction with such interfaces and to evaluate them at a fine grain of detail. In order to evaluate the usability and usefulness of such interfaces, one needs tools to collect and analyze richly detailed data pertaining to both the process and outcomes of user interaction. Eye tracking is a technology that can provide detailed data on the allocation and shifts of users' visual attention across interface entities. Eye movement data, when combined with data from other input modalities (such as spoken commands and haptic actions with the keyboard and the mouse), results in just such a rich data set. However, integrating, analyzing and visualizing multimodal data on user interactions remains a difficult task. In this paper we report on a first step toward developing a suite of tools to facilitate this task. We designed and implemented an Eye Tracking Analysis System that generates combined gaze and action visualizations from eye movement data and interaction logs. This new visualization allows an experimenter to see the visual attention shifts of users interleaved with their actions on each screen of a multi-screen interface. A pilot experiment comparing two interfaces to an educational multimedia application, a traditional interface and a speech-controlled one, was carried out to test the utility of our tool.
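
The abstract gives no implementation details of the Eye Tracking Analysis System. Purely as an illustration of the kind of timestamp-based integration it describes, the sketch below merges hypothetical gaze fixations and interaction-log events into a single time-ordered stream and groups that stream by interface screen. All field names, record formats, and function names here are assumptions for the sake of the example, not the authors' actual data format or tool.

    # Illustrative sketch only: interleave hypothetical gaze fixations and
    # interaction-log events by timestamp, then group by interface screen.
    # Field names are assumed, not taken from the paper.
    from dataclasses import dataclass
    from itertools import groupby
    from typing import List, Tuple

    @dataclass
    class Event:
        timestamp: float   # seconds from start of the session
        screen: str        # which screen of the multi-screen interface was active
        kind: str          # "gaze" or "action"
        detail: str        # fixated region, spoken command, key press, mouse click, ...

    def interleave(gaze: List[Event], actions: List[Event]) -> List[Event]:
        """Combine gaze fixations and user actions into one time-ordered stream."""
        return sorted(gaze + actions, key=lambda e: e.timestamp)

    def by_screen(combined: List[Event]) -> List[Tuple[str, List[Event]]]:
        """Group a time-ordered stream into contiguous visits to each screen."""
        return [(screen, list(group))
                for screen, group in groupby(combined, key=lambda e: e.screen)]

    if __name__ == "__main__":
        gaze = [Event(0.2, "menu", "gaze", "Play button"),
                Event(1.1, "menu", "gaze", "Help icon")]
        actions = [Event(0.9, "menu", "action", 'spoken command: "play"'),
                   Event(1.4, "lesson1", "action", "mouse click: Next")]
        for screen, seq in by_screen(interleave(gaze, actions)):
            print(screen, [(e.timestamp, e.kind, e.detail) for e in seq])

Grouping contiguous runs of events per screen (rather than pooling all events for a screen) preserves the temporal interleaving of attention shifts and actions that the combined visualization is meant to expose.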