Our goal in writing "Damaged Merchandise?" (DM) was not to have the last word on the subject but to raise awareness within the human-computer interaction (HCI) community of issues that we felt had been ignored or neglected for too long. On reading the 10 commentaries from distinguished members of the HCI community, we were pleased to see that they had joined the debate and broadened the discussion. We were then somewhat torn about how to proceed. Our first thought was to respond point by point, commentary by commentary. However, we refrain from addressing many specific issues here, as a full discussion would require an article at least as long as DM itself. Instead, we focus on a few important themes that emerged throughout our article and the ensuing discussion:

• What is usability, how do we measure it, and what do we need to know about our usability evaluation methods (UEMs)?
• Why do we find ourselves where we are?
• What is the role of experiments versus other empirical studies in HCI? Are there common issues in the design of empirical studies?
• How do we judge the value of a study?
• Where do we go from here?