In recent years, schools and universities have incorporated personal digital assistants (PDAs) into their teaching curricula in an attempt to enhance students' learning experience and reduce instructors' workload. One of the most common classroom uses of PDAs is test administration. This study compared the usability (effectiveness, efficiency, and satisfaction) of a PDA-based quiz application with that of standard paper-and-pencil quizzes in a university course. Effectiveness was measured by students' quiz scores and a mental-workload questionnaire; efficiency by the time students took to complete each quiz; and satisfaction by a subjective user-satisfaction questionnaire. The PDA-based quiz proved more efficient: students completed it in less time than the paper-and-pencil quiz. No differences in effectiveness or satisfaction were found between the two quiz types, and computer anxiety was unaffected by quiz type. For these reasons, and because of other advantages to both students (e.g., real-time scoring) and teachers (e.g., less time spent on grading), PDAs are an attractive test-administration option for schools and universities.