Readings in information visualization
Creating creativity: user interfaces for supporting innovation
ACM Transactions on Computer-Human Interaction (TOCHI) - Special issue on human-computer interaction in the new millennium, Part 1
The challenge of information visualization evaluation
Proceedings of the working conference on Advanced visual interfaces
An Insight-Based Longitudinal Study of Visual Analytics
IEEE Transactions on Visualization and Computer Graphics
BELIV'06: BEyond time and errors: novel evaluation methods for information visualization
interactions - Business leadership and the UX manager
Mapping the users' problem-solving strategies in the participatory design of visual analytics methods
USAB'10 Proceedings of the 6th international conference on HCI in work and learning, life and leisure: workgroup human-computer interaction and usability engineering
Using gaze data in evaluating interactive visualizations
HCIV'09 Proceedings of the Second IFIP WG 13.7 conference on Human-computer interaction and visualization
Many roads lead to Rome: mapping users' problem-solving strategies
Proceedings of the 3rd BELIV'10 Workshop: BEyond time and errors: novel evaLuation methods for Information Visualization
Many roads lead to Rome: mapping users' problem-solving strategies
Information Visualization - Special issue on Evaluation for Information Visualization
Journal of the American Society for Information Science and Technology
Evaluation methods for creativity support environments
CHI '13 Extended Abstracts on Human Factors in Computing Systems
Information visualization systems allow users to produce insights, innovations, and discoveries. Evaluating such tools is a challenging task, and the goal of BELIV'08 is to take a step forward in understanding this complex activity. Current evaluation methods exhibit noticeable limitations, and researchers in the area experience frustration with evaluation processes that are time-consuming and too often lead to unsatisfactory results. The most commonly used evaluation metrics, such as task-completion time and number of errors, appear insufficient to quantify the quality of an information visualization system; hence the name of the workshop: "beyond time and errors".