Retrospective think-aloud (RTA) is a usability method that collects a user's verbalizations about their performance after the performance is over. Little work has been done to investigate the validity and reliability of RTA. This paper reports an experiment investigating these issues for a variant of the method, stimulated RTA. By comparing participants' verbalizations with their eye movements, we find support for the validity and reliability of stimulated RTA: the method provides a valid account of what people attended to while completing tasks, carries a low risk of introducing fabrications, and its validity is not affected by task complexity. A more detailed analysis shows that RTA also provides additional information about users' inferences and strategies in completing tasks. These findings give usability practitioners grounds to use RTA and to trust the performance information it collects in a usability study.
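The core of the validation approach described above is to check each retrospective verbalization against the eye-tracking record: a verbalized element that was actually fixated counts as valid, while one that was never fixated is a potential fabrication. The sketch below illustrates that comparison for a single task; it is not from the paper, and the area-of-interest (AOI) labels and the simple set-overlap measure are illustrative assumptions.

```python
def rta_agreement(fixated, verbalized):
    """Compare retrospective verbalizations against eye-tracking data
    for one task (illustrative sketch, not the paper's exact analysis).

    fixated    -- set of AOI labels the participant actually fixated
    verbalized -- set of AOI labels mentioned during stimulated RTA

    Returns (validity, fabrication_rate):
      validity         -- share of verbalized AOIs backed by a fixation
      fabrication_rate -- share of verbalized AOIs never fixated
    """
    if not verbalized:
        return 0.0, 0.0
    matched = len(verbalized & fixated)
    validity = matched / len(verbalized)
    return validity, 1.0 - validity

# Hypothetical task: four AOIs fixated, three verbalized,
# one of which ("footer") never received a fixation.
fix = {"search_box", "nav_menu", "results_list", "ad_banner"}
verb = {"search_box", "results_list", "footer"}
validity, fabrication = rta_agreement(fix, verb)
```

In this toy example two of the three verbalized AOIs are confirmed by fixations, so validity is 2/3 and the fabrication rate 1/3; the paper's finding of a low fabrication risk corresponds to this rate staying small across participants and task complexities.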