The importance of user-centred evaluation is stressed by HCI academics and practitioners alike. However, there have been few recent evaluation studies of Usability Evaluation Methods (UEMs), especially studies that aim to improve the methods rather than simply assess their efficacy (i.e., formative rather than summative evaluations). In this article, we present formative evaluations of two new methods for assessing the functionality and usability of a particular type of interactive system: electronic information resources. These evaluations serve as an example of an approach for assessing the success of new HCI methods. We taught the methods to a group of electronic resource developers and collected a mixture of focus group, method usage and summary questionnaire data, all focusing on how useful, usable and learnable the developers perceived the methods to be and how likely they were to use them in the future. Findings for both methods were generally positive, and participants made useful suggestions for improvement. Our evaluation sessions also highlighted a number of trade-offs in the development of UEMs, as well as general lessons learned, which we discuss in order to inform the future development and evaluation of HCI methods.