Many sources of empirical data can be used to evaluate an interface (e.g., time to learn, time to perform benchmark tasks, number of errors on benchmark tasks, questionnaire answers, comments made in verbal protocols). This paper examines the relative contributions of quantitative and qualitative data gathered during a usability study. For each usability problem uncovered by the study, we trace every contributing piece of evidence back to its empirical source. In this study, the verbal protocol provided the sole source of evidence for more than one third of the most severe problems and more than two thirds of the less severe problems. Thus, although the verbal protocol provided the bulk of the evidence, other sources of data contributed disproportionately to the more critical problems. This work suggests that further research is required to determine the relative value of different forms of empirical evidence.