Evaluating the Information Behaviour methods: Formative evaluations of two methods for assessing the functionality and usability of electronic information resources

  • Authors:
  • Stephann Makri, Ann Blandford, Anna L. Cox, Simon Attfield, Claire Warwick

  • Affiliations:
  • Stephann Makri, Ann Blandford, Anna L. Cox: UCL Interaction Centre, University College London, 8th Floor, Malet Place Engineering Building, Gower Street, London WC1E 6BT, UK
  • Simon Attfield: Interaction Design Centre, School of Computing Science, Middlesex University, The Burroughs, Hendon, London NW4 4BT, UK
  • Claire Warwick: Department of Information Studies, University College London, Gower Street, London WC1E 6BT, UK

  • Venue:
  • International Journal of Human-Computer Studies
  • Year:
  • 2011

Abstract

The importance of user-centred evaluation is stressed by HCI academics and practitioners alike. However, there have been few recent evaluation studies of User Evaluation Methods (UEMs), especially those with the aim of improving methods rather than assessing their efficacy (i.e. formative rather than summative evaluations). In this article, we present formative evaluations of two new methods for assessing the functionality and usability of a particular type of interactive system: electronic information resources. These serve as an example of an evaluation approach for assessing the success of new HCI methods. We taught the methods to a group of electronic resource developers and collected a mixture of focus group, method usage and summary questionnaire data, all focusing on how useful, usable and learnable the developers perceived the methods to be and how likely they were to use them in the future. Findings related to both methods were generally positive, and useful suggestions for improvement were made. Our evaluation sessions also highlighted a number of trade-offs for the development of UEMs and general lessons learned, which we discuss in order to inform the future development and evaluation of HCI methods.