Heuristic evaluation of user interfaces
CHI '90 Proceedings of the SIGCHI Conference on Human Factors in Computing Systems
Finding usability problems through heuristic evaluation
CHI '92 Proceedings of the SIGCHI Conference on Human Factors in Computing Systems
interactions
Faster, cheaper!! Are usability inspection methods as effective as empirical testing?
Usability inspection methods
CHI '95 Conference Companion on Human Factors in Computing Systems
Remote evaluation: the network as an extension of the usability laboratory
Proceedings of the SIGCHI Conference on Human Factors in Computing Systems
Remote usability evaluation: can users report their own critical incidents?
CHI '98 Conference Summary on Human Factors in Computing Systems
A comparison of usage evaluation and inspection methods for assessing groupware usability
GROUP '01 Proceedings of the 2001 International ACM SIGGROUP Conference on Supporting Group Work
In the lab and out in the wild: remote web usability testing for mobile devices
CHI '02 Extended Abstracts on Human Factors in Computing Systems
Handbook of Usability Testing: How to Plan, Design, and Conduct Effective Tests
Usability remote evaluation for WWW
CHI '00 Extended Abstracts on Human Factors in Computing Systems
Methods for Identifying Usability Problems with Web Sites
Proceedings of the IFIP TC2/TC13 WG2.7/WG13.4 Seventh Working Conference on Engineering for Human-Computer Interaction
HICSS '99 Proceedings of the Thirty-Second Annual Hawaii International Conference on System Sciences - Volume 2
Remote evaluation for post-deployment usability improvement
AVI '98 Proceedings of the working conference on Advanced visual interfaces
A comparison of synchronous remote and local usability studies for an expert interface
CHI '04 Extended Abstracts on Human Factors in Computing Systems
Instant data analysis: conducting usability evaluations in a day
Proceedings of the third Nordic conference on Human-computer interaction
Automated summative usability studies: an empirical evaluation
Proceedings of the SIGCHI Conference on Human Factors in Computing Systems
Remote usability evaluations with disabled people
Proceedings of the SIGCHI Conference on Human Factors in Computing Systems
IV '06 Proceedings of the conference on Information Visualization
It's worth the hassle!: the added value of evaluating the usability of mobile systems in the field
Proceedings of the 4th Nordic conference on Human-computer interaction: changing roles
What happened to remote usability testing?: an empirical study of three methods
Proceedings of the SIGCHI Conference on Human Factors in Computing Systems
User experience at Google: focus on the user and all else will follow
CHI '08 Extended Abstracts on Human Factors in Computing Systems
Obstacles to usability evaluation in practice: a survey of software development organizations
Proceedings of the 5th Nordic conference on Human-computer interaction: building bridges
Journal of Systems and Software
Cultural differences in smartphone user experience evaluation
Proceedings of the 9th International Conference on Mobile and Ubiquitous Multimedia
Proceedings of the SIGCHI Conference on Human Factors in Computing Systems
FeedLack detects missing feedback in web applications
Proceedings of the SIGCHI Conference on Human Factors in Computing Systems
The impact of distraction in natural environments on user experience research
TPDL'11 Proceedings of the 15th international conference on Theory and practice of digital libraries: research and advanced technology for digital libraries
The effect of task assignments and instruction types on remote asynchronous usability testing
Proceedings of the SIGCHI Conference on Human Factors in Computing Systems
Backtracking Events as Indicators of Usability Problems in Creation-Oriented Applications
ACM Transactions on Computer-Human Interaction (TOCHI)
Evaluating a web-based tool for crowdsourced navigation stress tests
DUXU'13 Proceedings of the Second international conference on Design, User Experience, and Usability: web, mobile, and product design - Volume Part IV
Remote asynchronous usability testing is characterized by both spatial and temporal separation of users and evaluators. This has the potential both to reduce the practical problems of securing user attendance and to allow direct involvement of users in usability testing. In this paper, we report on an empirical study that systematically compared three methods for remote asynchronous usability testing: user-reported critical incidents, forum-based online reporting and discussion, and diary-based longitudinal user reporting. In addition, conventional laboratory-based think-aloud testing was included as a benchmark for the remote methods. The results show that each remote asynchronous method supports identification of a considerable number of usability problems. Although this is only about half the number identified with the conventional method, the remote methods require significantly less time, making them an appealing option for usability testing in many software projects.