Lab-based testing is one of the key methods for evaluating website usability. Yet the artificial conditions of the setting, including surveillance and stylized tasks, can distort user behavior and limit the data that can be obtained. This paper examines the effectiveness of this standard method compared against two complementary methods that involve more natural, user-driven evaluation contexts: pre-session homework assignments and online usability testing. Drawing on illustrations from recent studies of online shopping sites, we detail the advantages and limitations of each method and argue that employing them in combination could improve both the quantity and the quality of findings. We then propose that future work focus on optimizing this combined approach through sequencing, so that one evaluation method informs the design of those used subsequently.