The idea of conducting usability tests remotely emerged ten years ago. It has since been studied empirically, and some software organizations now employ remote methods, yet comparisons involving more than one remote method remain rare. This paper presents results from a systematic empirical comparison of three methods for remote usability testing and a conventional laboratory-based think-aloud method. The three remote methods are a synchronous condition, where testing is conducted in real time but the test monitor is spatially separated from the test subjects, and two asynchronous conditions, where the test monitor and the test subjects are separated both spatially and temporally. The results show that the remote synchronous method is virtually equivalent to the conventional method; it therefore has the potential to conveniently involve broader user groups in usability testing and to support new development approaches. The asynchronous methods are considerably more time-consuming for the test subjects and identify fewer usability problems, yet they may still be worthwhile.