Recent criticism of think-aloud testing (TA) points to discrepancies between theory and practice, the artificiality of the test situation, and inconsistencies in evaluators' interpretation of the procedure. Rather than enforcing a stricter TA procedure, we describe Cooperative Usability Testing (CUT), in which test users and evaluators pool their expertise to understand the usability problems of the application being evaluated. CUT consists of two sessions. In the interaction session, the test user tries out the application to uncover potential usability problems while the evaluators mainly observe, as in TA or contextual inquiry. In the interpretation session, evaluators and test users discuss what they consider the most important usability problems, supported by a video recording of the interaction session. In an exploratory study comparing CUT to TA, seven evaluators found that the interpretation sessions contributed important usability information beyond what TA provided. Test users also found participation in the interpretation sessions interesting.