This study proposes a new methodology for conducting synchronous remote usability studies using a three-dimensional virtual usability testing laboratory built with the Open Wonderland toolkit. This virtual laboratory method is then compared with two other commonly used synchronous usability test methods: the traditional lab approach and WebEx, a web-based conferencing and screen-sharing approach. A study was conducted with 48 participants in total: 36 test participants and 12 test facilitators. The test participants completed five tasks on a simulated e-commerce website. The three methodologies were compared with respect to the following dependent variables: the time taken to complete the tasks; the usability defects identified; the severity of these usability defects; and the subjective ratings from the NASA-TLX, presence, and post-test subjective questionnaires. The three methodologies agreed closely in terms of the total number of defects identified, the number of high-severity defects identified, and the time taken to complete the tasks. However, there was a significant difference in the workload experienced by the test participants and facilitators: the traditional lab condition imposed the least workload, while the virtual lab and WebEx conditions imposed similar, higher levels. It was also found that the test participants experienced greater involvement and a more immersive experience in the virtual world condition than in the WebEx condition; these ratings were not significantly different from those in the traditional lab condition. The results of this study suggest that participants were productive in and enjoyed the virtual lab condition, indicating the potential of a virtual-world-based approach as an alternative to the conventional approaches for synchronous usability testing.
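As an aside on the workload measure used above: the overall NASA-TLX score is conventionally computed as a weighted mean of six subscale ratings (0-100), with weights obtained from 15 pairwise comparisons between the subscales. A minimal sketch, with hypothetical rating and weight values chosen only for illustration:

```python
def nasa_tlx(ratings, weights):
    """Return the overall weighted NASA-TLX workload score (0-100).

    ratings: dict mapping each of the six subscales to a 0-100 rating.
    weights: dict mapping each subscale to its pairwise-comparison weight;
             the 15 pairwise comparisons make the weights sum to 15.
    """
    assert set(ratings) == set(weights)
    assert sum(weights.values()) == 15
    return sum(ratings[d] * weights[d] for d in ratings) / 15

# Hypothetical data for one participant (not from the study reported here).
ratings = {
    "mental": 70, "physical": 10, "temporal": 55,
    "performance": 40, "effort": 65, "frustration": 30,
}
weights = {
    "mental": 4, "physical": 0, "temporal": 3,
    "performance": 2, "effort": 4, "frustration": 2,
}

score = nasa_tlx(ratings, weights)
```

With these example values the weighted sum is 845, giving an overall score of about 56.3; an unweighted ("raw TLX") variant simply averages the six ratings instead.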