TAPRAV: An interactive analysis tool for exploring workload aligned to models of task execution
Interacting with Computers
Existing analysis tools are not sufficient for exploring pupillary response, because the data typically needs to be examined in relation to the corresponding task's execution. To address this need, we have developed an interactive visualization tool called TAPRAV. Its key components include (i) a visualization of the pupillary response aligned to the model of task execution, useful for making sense of the overall data set; (ii) an interactive overview+detail metaphor, enabling rapid inspection of details; (iii) synchronization with the video of screen interaction, providing awareness of the state of the task; and (iv) interaction supporting discovery-driven analysis.
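The core of component (i) is aligning time-stamped pupillary samples to the intervals of a task-execution model. As a minimal sketch of that alignment step (the data shapes, sample values, and the `mean_pupil_per_task` helper below are hypothetical illustrations, not TAPRAV's actual implementation):

```python
# Hypothetical data shapes; TAPRAV's real formats are not described in the abstract.
# Pupil samples: (timestamp_seconds, pupil_diameter_mm)
samples = [(0.0, 3.1), (0.5, 3.3), (1.0, 3.6), (1.5, 3.4), (2.0, 3.9), (2.5, 4.1)]

# Task-execution model: (task_name, start_s, end_s), ordered and non-overlapping
tasks = [("read prompt", 0.0, 1.0), ("enter response", 1.0, 2.5)]

def mean_pupil_per_task(samples, tasks):
    """Align pupil samples to task intervals and average the diameter within each."""
    result = {}
    for name, start, end in tasks:
        vals = [d for t, d in samples if start <= t < end]
        result[name] = sum(vals) / len(vals) if vals else None
    return result

print(mean_pupil_per_task(samples, tasks))
```

An overview+detail view such as the one described in (ii) would then plot the raw samples in the detail pane while summaries like these per-task means annotate the overview.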