Graph visualizations are typically evaluated by comparing their effectiveness as measured by task performance, such as response time and accuracy. Such performance-based measures have proved useful in their own right. In some situations, however, performance measures alone may not be sensitive enough to detect differences: viewers can reach the same level of performance while investing different amounts of cognitive effort. Moreover, individual performance measures rarely favor one visualization consistently, which makes it difficult to choose one visualization over another in design and evaluation. To overcome these limitations, we measure the effectiveness of graph visualizations from a cognitive load perspective. We first review human memory as an information-processing system and recent results from cognitive load research. We then propose and discuss the construct of cognitive load in the context of graph visualization, and present a model of the interacting relations between task performance, mental effort, and cognitive load. A cognitive load measure, mental effort, is introduced and combined with traditional performance measures into a single multi-dimensional measure called visualization efficiency. The proposed model and measurements are then tested for validity in a user study. Implications of cognitive load considerations for graph visualization are discussed.
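As an illustration of how performance and mental effort can be folded into one efficiency score, the sketch below uses the classic combination from cognitive load research (Paas and van Merriënboer's instructional efficiency, E = (zP − zE)/√2, computed on standardized scores). This is an assumed formulation for illustration only, not necessarily the exact multi-dimensional measure defined in the paper, and the function names are hypothetical:

```python
import math
import statistics


def z_scores(xs):
    """Standardize raw scores to mean 0, sample standard deviation 1."""
    mu = statistics.mean(xs)
    sigma = statistics.stdev(xs)
    return [(x - mu) / sigma for x in xs]


def visualization_efficiency(performance, effort):
    """Combine standardized performance and mental-effort ratings into a
    single efficiency score per participant/condition, following the
    Paas & van Merrienboer-style formula E = (zP - zE) / sqrt(2).

    Positive E: performance is higher than the invested effort would
    predict; negative E: the same performance cost more effort.
    (Illustrative sketch, not the paper's exact measure.)
    """
    zp = z_scores(performance)
    ze = z_scores(effort)
    return [(p - e) / math.sqrt(2) for p, e in zip(zp, ze)]


# Hypothetical data: accuracy scores and 9-point mental-effort ratings
# for three visualization conditions.
eff = visualization_efficiency([1.0, 2.0, 3.0], [3.0, 2.0, 1.0])
```

With this toy data the third condition scores highest: it pairs the best performance with the lowest reported effort, which is precisely the kind of difference that response time and accuracy alone can fail to reveal.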