Presenting Web content through screen readers can be challenging, yet for many blind and visually impaired users it is the only means of access. The difficulties are more acute when the information forms part of an interactive process, as in the increasingly common “Web 2.0” applications: if the process is to be completed correctly and efficiently, the right information must be given to the user at the right time. Designing a non-visual interface that achieves this is a non-trivial task, and several approaches are possible. The one taken here uses eye-tracking to understand how sighted users interact with the content and how they benefit from the information, then applies this understanding to the design of a non-visual user interface. This paper describes how the technique was applied to develop audio interfaces for two common types of interaction: auto-suggest lists and pop-up calendars. Although the resulting interfaces differed considerably, one largely mirroring the visual representation and the other not, evaluations showed the approach to be successful: both audio implementations proved effective and were popular with participants.
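As a concrete illustration of the kind of audio interface the paper discusses (this sketch is not from the paper itself), an auto-suggest list can be exposed to screen readers through a WAI-ARIA live region. The function names below (`formatAnnouncement`, `announceSuggestions`) and the announcement wording are hypothetical; the sketch only shows one plausible way to build the spoken text and push it into a polite live region.

```javascript
// Hypothetical sketch: announcing auto-suggest results via an ARIA live region.
// formatAnnouncement builds the text a screen reader would speak; the count is
// announced first so the user knows the list size before hearing the items.
function formatAnnouncement(suggestions) {
  if (suggestions.length === 0) {
    return "No suggestions";
  }
  return `${suggestions.length} suggestions: ${suggestions.join(", ")}`;
}

// In a browser, the text would be written into a polite live region so it does
// not interrupt the user mid-keystroke, e.g.:
//
//   <div id="suggest-status" role="status" aria-live="polite"></div>
//
function announceSuggestions(liveRegion, suggestions) {
  liveRegion.textContent = formatAnnouncement(suggestions);
}
```

Marking the region `aria-live="polite"` (implicit in `role="status"`) means updates are queued until the screen reader is idle, which matches the goal of giving appropriate information at an appropriate time without disrupting typing.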