Analyzing usability improvement processes as they take place in real-life organizations is necessary to understand the practice of usability work. This paper describes a case study in which the usability of an information system is improved and a relationship between the improvements and the evaluation efforts is established. The results show that the evaluation techniques complemented each other by suggesting different kinds of usability improvement. Among the techniques applied, a combination of questionnaires and Metaphors of Human Thinking (MOT) showed the largest mean impact, and MOT produced the largest number of impacts. Logging of real-life use of the system over six months indicated six aspects of improved usability in which significant differences among evaluation techniques were found. On five of the six aspects, Think Aloud evaluations and the above-mentioned combination of questionnaires and MOT performed equally well, and both performed better than MOT alone. Based on the evaluations, 40 redesign proposals were developed, and 30 of these were implemented. Four of the implemented redesigns were considered especially important; these evolved with inspiration from multiple evaluations and were informed by stakeholders with different kinds of expertise. Our results suggest that practitioners should not rely on isolated evaluations. Instead, complementary techniques should be combined, and people with different kinds of expertise should be involved.