Usability evaluation helps to determine whether interactive systems support users in their work tasks. However, knowledge about those tasks and, more generally, about the work domain is difficult to bring to bear on the process and outcome of usability evaluation. One way to include such work-domain knowledge might be Cooperative Usability Testing, an evaluation method that consists of (a) interaction phases, similar to classic usability testing, and (b) interpretation phases, in which the test participant and the moderator discuss incidents and experiences from the interaction phases. We have studied whether such interpretation phases improve the relevance of usability evaluations in the development of work-domain-specific systems. The study included two development cases. We conclude that the interpretation phases generate additional insight and redesign suggestions related to observed usability problems. The interpretation phases also generate a substantial proportion of new usability issues, thereby providing richer evaluation output. Feedback from the developers of the evaluated systems indicates that the usability issues generated in the interpretation phases have substantial impact on the software development process. The benefits of the interpretation phases may be explained by the access they provide both to the test participants' work-domain knowledge and to their experiences as users.