With the development of novel interfaces controlled through multiple modalities, new approaches are needed to analyze the process of interacting with such interfaces and to evaluate them at a fine grain of detail. Evaluating the usability and usefulness of such interfaces requires tools to collect and analyze richly detailed data on both the process and the outcomes of user interaction. Eye tracking is a technology that provides detailed data on the allocation and shifts of users' visual attention across interface entities. Eye movement data, when combined with data from other input modalities (such as spoken commands and haptic actions with the keyboard and mouse), yields just such a rich dataset. However, integrating, analyzing, and visualizing multimodal data on user interactions remains a difficult task. In this paper we report a first step toward developing a suite of tools to facilitate this task. We designed and implemented an Eye Tracking Analysis System that generates combined gaze-and-action visualizations from eye movement data and interaction logs. This new visualization allows an experimenter to see users' visual attention shifts interleaved with their actions on each screen of a multi-screen interface. To test the utility of our tool, we carried out a pilot experiment comparing two interfaces to an educational multimedia application: a traditional interface and a speech-controlled one.
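The abstract does not give implementation details, but the core data step it describes — interleaving timestamped gaze fixations with interaction-log actions, grouped by screen — can be sketched as follows. This is a minimal illustration under assumed data shapes; the `Event` record, field names, and `interleave` function are all hypothetical, not the authors' actual system.

```python
from dataclasses import dataclass
from itertools import groupby

@dataclass
class Event:
    t: float       # timestamp in seconds (hypothetical field layout)
    screen: str    # which screen of the multi-screen interface
    kind: str      # e.g. "fixation", "mouse", "speech"
    detail: str    # free-form description of the event

def interleave(gaze, actions):
    """Merge gaze fixations and interaction-log actions into a single
    time-ordered stream per screen, so attention shifts appear
    interleaved with user actions."""
    # Sort by (screen, time) so groupby sees each screen's events
    # as one contiguous, chronologically ordered run.
    events = sorted(gaze + actions, key=lambda e: (e.screen, e.t))
    return {screen: list(evts)
            for screen, evts in groupby(events, key=lambda e: e.screen)}
```

A visualization layer would then walk each screen's merged list in order, drawing fixations and actions along a shared timeline.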