Analysis in usability evaluations: an exploratory study
Proceedings of the 6th Nordic Conference on Human-Computer Interaction: Extending Boundaries
Analysis is a key part of conducting usability evaluations, yet it is rarely studied systematically. As a result, we lack direction on how to research support for practitioners' analysis, and practitioners miss an opportunity to learn from one another. We surveyed 155 usability practitioners about the analysis in their most recent usability evaluation. Analysis is typically flexible and lightweight. At the same time, practitioners see a need to strengthen the reliability of evaluation. Redesign is closely integrated with analysis; more than half of the respondents include visual redesign suggestions in their evaluation deliverables. Analysis support from academic research, including tools, forms, and structured report formats, does not appear to have a direct impact on analysis practice. We offer six recommendations for future research to better support analysis.