Using PSP to evaluate student effort in achieving learning outcomes in a software engineering assignment

  • Authors:
  • Brian R. von Konsky; Jim Ivins; Mike Robey

  • Affiliations:
  • Curtin University of Technology, Perth, Western Australia (all authors)

  • Venue:
  • ACE '05 Proceedings of the 7th Australasian conference on Computing education - Volume 42
  • Year:
  • 2005


Abstract

The goal of this study was to measure the effort expended by students during a major assignment in a third-year software engineering subject. The purpose was to evaluate whether students were expending effort on activities not related to the stated learning outcomes, and to determine whether the assessment pattern and assignment scope were appropriate. The principal learning outcome was the ability to model system state using the Unified Modeling Language (UML), Ward and Mellor Data Flow Diagrams, and Z. Another outcome was the ability to show that system models expressed in these notations were valid and consistent. Students kept Personal Software Process (PSP) logs to record effort expended on all assignment activities. Student opinions regarding learning outcome attainment and the accuracy of PSP data were evaluated using an anonymous questionnaire. A total of 148 students reported spending an average of 24.9 hours working on the assignment and achieved an average mark of 62.6%. Bachelor of Engineering (Software Engineering) students generally achieved a higher mark while expending less effort than Bachelor of Science students studying Computer Science or Information Technology. Surprisingly, however, there was no correlation between effort and mark. Excessive time recorded in the PSP logs of some students, the large standard deviation (s = 12.6 hours), and the large number of outliers in the data suggest that many students either did not take the PSP seriously, or did not use time efficiently and were distracted by factors unrelated to the intended learning outcomes. Other potentially more efficient modes of assessment and feedback are discussed.