Throughout the history of HCI, interviews have been used to collect users' subjective evaluations of interactive technology. This paper raises the issue that such interviews are often deployed in a way that overlooks two aspects of evaluation: the relative positions from which the system is evaluated and the interviewees' interpretations of the system. In the study, 14 users of a new information system were asked to evaluate provocative claims about the system's usability. Analysis of their responses reveals two sources of variation: what is being evaluated and who is evaluating it. Interviewees evaluated the system's usability from five user positions: end user, supervisor, organization's representative, co-developer, and outsider. They also interpreted four "faces" of the system: UI, utility, communication medium, and unknown entity. These findings are used to draw broader conclusions about the system and its use, and procedures for improving user interviews in HCI are presented.