Using qualitative eye-tracking data to inform audio presentation of dynamic Web content

  • Authors:
  • Andy Brown; Caroline Jay; Simon Harper

  • Affiliations:
  • School of Computer Science, University of Manchester, Manchester, UK (all authors)

  • Venue:
  • The New Review of Hypermedia and Multimedia - Web Accessibility
  • Year:
  • 2010

Abstract

Presenting Web content through screen readers can be a challenging task, but this is the only means of access for many blind and visually impaired users. The difficulties are more acute when the information forms part of an interactive process, such as the increasingly common “Web 2.0 applications”. If the process is to be completed correctly and efficiently, it is vital that appropriate information is given to the user at an appropriate time. Designing a non-visual interface that achieves these aims is a non-trivial task, for which several approaches are possible. The one taken here is to use eye-tracking to understand how sighted users interact with the content and to gain insight into how they benefit from the information, and then to apply this understanding to the design of a non-visual user interface. This paper describes how this technique was applied to develop audio interfaces for two common types of interaction: auto-suggest lists and pop-up calendars. Although the resulting interfaces were quite different, one largely mirroring the visual representation and the other not, evaluations showed that the approach worked well, with both audio implementations proving effective and popular with participants.