This paper reports on a study investigating the strengths and weaknesses of questionnaires as software evaluation tools. Two major influences on the usefulness of questionnaire-based evaluation responses are examined: the administration of the questionnaire, and the background and experience of the respondent. Two questionnaires were administered to a large number of students in an introductory programming class and to a group of more experienced users (including course proctors). Respondents were asked to evaluate the text editor used in the class along several dimensions, with responses solicited through a variety of question types. Another group of students received the questionnaire individually, with part of it presented on the computer; a third group also evaluated an enhanced version of the editor in follow-up sessions.