The diversity of user goals and strategies makes creation-oriented applications such as word processors or photo editors difficult to test comprehensively. Evaluating such applications requires a large pool of participants to capture this diversity of experience, but traditional usability testing at that scale can be prohibitively expensive. To address this problem, this article contributes a new usability evaluation method called backtracking analysis, designed to automate the detection and characterization of usability problems in creation-oriented applications. The key insight is that interaction breakdowns in creation-oriented applications often manifest as backtracking operations that can be automatically logged (e.g., undo and erase operations). Backtracking analysis synchronizes these events with contextual data such as screen-capture video, helping the evaluator characterize specific usability problems. The results of three experiments demonstrate that backtracking events can be effective indicators of usability problems in creation-oriented applications and can provide a cost-effective alternative to traditional laboratory usability testing.
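As a rough illustration of the logging side of this idea (not the article's actual implementation; the class and method names are hypothetical), an instrumented application could timestamp undo and erase events relative to the session start, so they can later be aligned with screen-capture video, and cluster temporally close events into episodes worth an evaluator's attention:

```python
import time
from dataclasses import dataclass, field

# Hypothetical set of operations treated as backtracking indicators.
BACKTRACKING_EVENTS = {"undo", "erase"}

@dataclass
class BacktrackingLog:
    """Records backtracking events with session-relative timestamps
    so they can later be synchronized with screen-capture video."""
    session_start: float = field(default_factory=time.monotonic)
    events: list = field(default_factory=list)

    def record(self, action: str) -> None:
        # Only backtracking operations are logged; other actions are ignored.
        if action in BACKTRACKING_EVENTS:
            self.events.append((time.monotonic() - self.session_start, action))

    def episodes(self, gap: float = 2.0) -> list:
        """Group events separated by less than `gap` seconds into episodes;
        a burst of backtracking may signal a single usability problem."""
        groups, current = [], []
        for t, action in self.events:
            if current and t - current[-1][0] >= gap:
                groups.append(current)
                current = []
            current.append((t, action))
        if current:
            groups.append(current)
        return groups
```

An evaluator would then jump to each episode's timestamp in the recorded video to characterize the problem in context; the 2-second grouping threshold here is an arbitrary placeholder, not a value from the article.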