Think-aloud testing is a widely used usability evaluation method, yet its use in practice is rarely studied. We report an exploratory study of 14 think-aloud sessions, whose audio recordings were examined in detail. The study shows that immediate analysis of observations made during the sessions happens only sporadically, if at all. When testing, evaluators tend to seek confirmation of problems they are already aware of, and they often ask users about expectations and hypothetical situations rather than about problems actually experienced. Moreover, evaluators learn much about the usability of the tested system but little about its utility. The study shows how practical realities rarely discussed in the usability evaluation literature shape sessions. We discuss implications for usability researchers and professionals, including techniques for fast-paced analysis and tools for capturing observations during sessions.
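As a minimal sketch of what such an observation-capture tool might look like, the Python script below lets an evaluator type short notes during a session and saves each one with a wall-clock timestamp, elapsed time, and a simple tag, so analysis can start immediately afterwards. The file name, log format, and `!` problem-marker convention are illustrative assumptions, not anything described in the paper.

```python
#!/usr/bin/env python3
"""Hypothetical sketch: timestamped note capture for a think-aloud session."""

import csv
import sys
import time
from datetime import datetime

LOG_FILE = "session_observations.csv"  # assumed output path

def main() -> None:
    session_start = time.monotonic()
    with open(LOG_FILE, "w", newline="", encoding="utf-8") as f:
        writer = csv.writer(f)
        writer.writerow(["wall_clock", "elapsed_s", "tag", "note"])
        print("Type observations; prefix with '!' to mark a suspected "
              "usability problem. An empty line ends the session.")
        for line in sys.stdin:
            note = line.strip()
            if not note:
                break
            # Tag notes so suspected problems can be filtered out quickly
            # during fast-paced analysis right after the session.
            tag = "problem" if note.startswith("!") else "remark"
            writer.writerow([
                datetime.now().isoformat(timespec="seconds"),
                round(time.monotonic() - session_start, 1),
                tag,
                note.lstrip("!").strip(),
            ])
            f.flush()  # keep the log intact even if the session is cut short

if __name__ == "__main__":
    main()
```

The per-note timestamps are the point of the design: they let observations be matched back to the audio recording during analysis instead of relying on the evaluator's memory.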