The cognitive walkthrough method: a practitioner's guide
Usability inspection methods
Usability Engineering
Systematic evaluation of e-learning systems: an experimental validation
Proceedings of the 4th Nordic conference on Human-computer interaction: changing roles
ACM Transactions on Computer-Human Interaction (TOCHI)
Handbook of Usability Testing: How to Plan, Design, and Conduct Effective Tests
Usability Evaluation of a Learning Management System
HICSS '10 Proceedings of the 2010 43rd Hawaii International Conference on System Sciences
Quality of web usability evaluation methods: an empirical study on MiLE+
WISE'07 Proceedings of the 2007 international conference on Web information systems engineering
SUE inspection: an effective method for systematic usability evaluation of hypermedia
IEEE Transactions on Systems, Man, and Cybernetics, Part A: Systems and Humans
In this paper we describe a study assessing how the usability of a teaching system designed for distance learning affects learning across different types of multitasking activity. The learning performance of six groups of students was compared after individual interaction with either a usable or a non-usable version of the system, under conditions of simple learning, sequential multitasking, or concurrent multitasking. Results show that learning is negatively affected by a system that is difficult to use. In addition, multitasking impairs learning only when students must acquire new information while doing something else at the same time (concurrent multitasking). The usability level of the system does not appear to interact with the multitasking modality of learning.