TAPRAV: a tool for exploring workload aligned to models of task execution
Proceedings of the working conference on Advanced visual interfaces
Pupillary response is a valid indicator of mental workload and is increasingly being leveraged to identify lower-cost moments for interruption, to evaluate complex interfaces, and to develop further understanding of psychological processes. Existing tools are not sufficient for analyzing this type of data, which typically needs to be analyzed in relation to the corresponding task's execution. To address this emerging need, we have developed a new interactive analysis tool, TAPRAV. The primary components of the tool include: (i) a visualization of pupillary response aligned to the corresponding model of task execution, useful for exploring relationships between these two data sources; (ii) an interactive overview+detail metaphor, enabling rapid inspection of details while maintaining global context; (iii) synchronized playback of the video of the user's screen interaction, providing awareness of the state of the task; and (iv) interaction supporting discovery-driven analysis. Results from a user study showed that users are able to interact efficiently with the tool to analyze relationships between pupillary response and task execution. The primary contribution of our tool is that it demonstrates an effective visualization and interaction design for rapidly exploring pupillary response in relation to models of task execution, thereby reducing the analysis effort.
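The core alignment the abstract describes can be illustrated in miniature: given timestamped pupil-diameter samples and a task-execution model expressed as timed subtask intervals, each sample is binned into the subtask whose interval contains it, and workload can then be summarized per subtask. This is a minimal sketch of that idea, not TAPRAV's implementation; the task model, sample data, and function names below are hypothetical.

```python
# Illustrative sketch (not the paper's code): align timestamped pupillary
# samples to intervals of a task-execution model and summarize mean pupil
# diameter per subtask. All names and data are hypothetical.
from statistics import mean

# Hypothetical task model: (subtask name, start time in s, end time in s)
task_model = [
    ("read instructions", 0.0, 5.0),
    ("search list", 5.0, 12.0),
    ("enter answer", 12.0, 15.0),
]

# Hypothetical pupil samples: (timestamp in s, pupil diameter in mm)
samples = [(t * 0.5, 3.0 + 0.1 * (t % 4)) for t in range(30)]

def mean_diameter_per_subtask(model, samples):
    """Average pupil diameter within each subtask's time interval."""
    summary = {}
    for name, start, end in model:
        in_window = [d for t, d in samples if start <= t < end]
        summary[name] = mean(in_window) if in_window else None
    return summary

print(mean_diameter_per_subtask(task_model, samples))
```

A tool like TAPRAV layers interactive visualization (overview+detail, synchronized video) on top of exactly this kind of alignment, so an analyst can inspect where in the task model workload peaks occur rather than reading summary numbers.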