Glass box: capturing, archiving, and retrieving workstation activities
Proceedings of the 3rd ACM workshop on Continuous archival and retrieval of personal experiences
In this paper, we discuss the challenges of developing an infrastructure to support a new generation of analytic tools for information analysts. The infrastructure provides data for establishing context about what the analyst is doing with the analytic tools, offers an integration environment that allows suites of tools to work together, and supports evaluation of the analytic tools. We discuss the functionality of the Glass Box, the challenges of evaluating adaptive systems, including the capture of data for evaluation metrics, and lessons learned from our experiences to date.
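The abstract describes a capture layer that records what the analyst is doing so that other tools can query that context. The paper itself gives no code, so the following is purely an illustrative sketch under assumed names (`ActivityEvent`, `CaptureStore`, and the event schema are all hypothetical, not the Glass Box's actual design): a timestamped event store that one tool writes to and another tool reads from to establish context.

```python
import time
from dataclasses import dataclass, asdict
from typing import Dict, List


@dataclass
class ActivityEvent:
    """One captured workstation event (hypothetical schema)."""
    timestamp: float
    application: str  # tool that generated the event, e.g. "browser"
    action: str       # e.g. "copy", "open_document", "query"
    detail: str       # payload such as selected text or a query string


class CaptureStore:
    """Minimal in-memory stand-in for a capture/archive layer."""

    def __init__(self) -> None:
        self.events: List[ActivityEvent] = []

    def record(self, application: str, action: str, detail: str) -> None:
        """Append a timestamped event from one of the integrated tools."""
        self.events.append(ActivityEvent(time.time(), application, action, detail))

    def context_for(self, application: str) -> List[Dict]:
        """Return one tool's events, e.g. so another tool can read its context."""
        return [asdict(e) for e in self.events if e.application == application]


store = CaptureStore()
store.record("browser", "copy", "budget figures for Q3")
store.record("editor", "paste", "budget figures for Q3")
print(len(store.context_for("browser")))  # one browser event recorded
```

In a real system of this kind, the store would persist events to an archive and the evaluation component would compute metrics over the same event stream; the point here is only that capture, integration, and evaluation can share one event log.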