We present DUE (Distributed Usability Evaluation), a technique for collecting and evaluating usability data. The DUE infrastructure is a client-server network. A client-based tool resides on each user's workstation, providing screen video recording, microphone capture of voice commentary, and a window for a severity rating. The idea is that the user works naturalistically, clicking a button whenever a usability problem or point of uncertainty is encountered, describing it verbally while illustrating it on screen, and rating its severity. These incidents accumulate on a server, where an evaluator (usability expert) and product developers or managers can review and analyse them. DUE supports evaluation from the running-prototype stage onwards. We present a case study of the use of DUE in a corporate environment. The study indicates that the DUE technique is effective in terms of low bias, high efficiency, and clear communication of usability issues among users, evaluators, and developers. Further, DUE supports long-term evaluations, making empirical studies of learnability possible.
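To make the client-server flow concrete, the following is a minimal sketch of the kind of incident record a DUE-style client might capture and accumulate on a server. The paper does not specify DUE's data model, field names, or transport, so everything here (the UsabilityIncident fields, the report_incident helper, and the server's store method) is an illustrative assumption, not DUE's actual implementation.

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone

@dataclass
class UsabilityIncident:
    """One user-reported incident, captured when the user clicks the
    report button in the client-side tool. All fields are assumptions
    chosen to match the capabilities described in the abstract."""
    user_id: str             # identifier of the reporting user's workstation
    timestamp: datetime      # when the report button was clicked
    severity: int            # user's severity rating, e.g. 1 (minor) to 5 (blocking)
    commentary_audio: bytes  # microphone recording of the verbal description
    screen_video: bytes      # screen recording illustrating the incident
    tags: list[str] = field(default_factory=list)  # added later by the evaluator

def report_incident(server, user_id: str, severity: int,
                    audio: bytes, video: bytes) -> None:
    """Client-side upload: accumulate the incident on the server so an
    evaluator or developer can review and analyse it later. `server` is
    assumed to be any object exposing a store(incident) method."""
    incident = UsabilityIncident(
        user_id=user_id,
        timestamp=datetime.now(timezone.utc),
        severity=severity,
        commentary_audio=audio,
        screen_video=video,
    )
    server.store(incident)
```

Accumulating self-contained records like this on a central server is what would let evaluators and developers replay incidents asynchronously, and the per-user timestamps are what would make the long-term learnability studies mentioned above possible.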