Remote evaluation: the network as an extension of the usability laboratory
Proceedings of the SIGCHI Conference on Human Factors in Computing Systems
On the contributions of different empirical data in usability testing
DIS '00 Proceedings of the 3rd conference on Designing interactive systems: processes, practices, methods, and techniques
Human Factors and Web Development
Automated summative usability studies: an empirical evaluation
Proceedings of the SIGCHI Conference on Human Factors in Computing Systems
Remote usability evaluations with disabled people
Proceedings of the SIGCHI Conference on Human Factors in Computing Systems
Evaluating mobile devices: in the field or in the laboratory?
IHM '06 Proceedings of the 18th International Conference of the Association Francophone d'Interaction Homme-Machine
What happened to remote usability testing?: an empirical study of three methods
Proceedings of the SIGCHI Conference on Human Factors in Computing Systems
Let your users do the testing: a comparison of three remote asynchronous usability testing methods
Proceedings of the SIGCHI Conference on Human Factors in Computing Systems
Testing remote users: an innovative technology
UI-HCII'07 Proceedings of the 2nd international conference on Usability and internationalization
Usability Testing Essentials: Ready, Set...Test!
Journal of Systems and Software
Synchronous remote usability testing: a new approach facilitated by virtual worlds
Proceedings of the SIGCHI Conference on Human Factors in Computing Systems
Testing touch: emulators vs. devices
IDGD'11 Proceedings of the 4th international conference on Internationalization, design and global development
Remote usability testing using eyetracking
INTERACT'11 Proceedings of the 13th IFIP TC 13 international conference on Human-computer interaction - Volume Part I
Comparing benchmark task and insight evaluation methods on timeseries graph visualizations
Proceedings of the 3rd BELIV'10 Workshop: BEyond time and errors: novel evaLuation methods for Information Visualization
The effect of task assignments and instruction types on remote asynchronous usability testing
Proceedings of the SIGCHI Conference on Human Factors in Computing Systems
A comparison of benchmark task and insight evaluation methods for information visualization
Information Visualization - Special issue on Evaluation for Information Visualization
Synchronous remote usability studies can be a convenient and cost-effective alternative to conventional local usability studies. Although they are common in practice, little research has compared synchronous remote studies with local ones. In our comparison of remote and local studies of an expert interface, the primary differences lay in the participants' and facilitators' qualitative experiences. The remote and local studies agreed closely, with no significant differences in the number of usability issues found, their type, or their severity. While our comparison focuses on an expert interface and more work is needed to understand remote studies in general, our experience suggests that evaluators of expert interfaces will have comparable success identifying usability issues with either remote or local studies.