Usability evaluation of an e-learning tutorial: criteria, questions and case study

  • Authors:
  • Ruth De Villiers

  • Affiliations:
  • School of Computing, University of South Africa, P O Box 392, UNISA, 0003, South Africa

  • Venue:
  • SAICSIT '04: Proceedings of the 2004 annual research conference of the South African Institute of Computer Scientists and Information Technologists on IT research in developing countries
  • Year:
  • 2004

Abstract

Computing systems require rigorous evaluation of both functionality and usability. Evaluation of software within the growing e-learning sector is currently receiving attention. Relevant aspects are evaluation paradigms, techniques, and the question of who does the evaluating. Squires and Preece developed a set of 'learning with software' heuristics for use by experts/educators in predictive evaluation prior to adopting a system. These were adapted for post-production end-user evaluation of an operational e-learning tutorial lesson, Relations, used in Theoretical Computer Science. Findings are presented from a questionnaire survey among learners. This process evaluated the artifact and also reflectively confirmed the utility of the evaluation technique and criteria. Lessons have also been learned for the future development of educational software.