Quality of student contributed questions using PeerWise

  • Authors:
  • Paul Denny;Andrew Luxton-Reilly;Beth Simon

  • Affiliations:
  • University of Auckland, Auckland, New Zealand;University of Auckland, Auckland, New Zealand;University of California, San Diego, La Jolla, CA

  • Venue:
  • ACE '09 Proceedings of the Eleventh Australasian Conference on Computing Education - Volume 95
  • Year:
  • 2009


Abstract

PeerWise is an online tool that involves students in the process of creating, sharing, answering and discussing multiple choice questions. Previous work has shown that students voluntarily use the large repository of questions developed by their peers as a source of revision for formal examinations, and that activity level correlates with improved exam performance. In this paper, we investigate the quality of the questions created by students in a large introductory programming course. We also examine students' ability to assess question quality. We find that students very commonly ask clear questions that are free from error and have correct answers. Of the few questions we examined that contained errors, in all cases those errors were detected and corrected by other students. We also report that students are effective judges of question quality, and are willing to use the judgements of their peers to decide which questions to answer. We include several case studies of questions that are representative of the kinds of questions in the repository, and provide insight for instructors considering the use of PeerWise in their classrooms.