Aropä is a web-based peer assessment support tool that has been used extensively in a wide variety of settings over the past three years. We describe the design of Aropä and how it can be configured, and present results from a research study into the use of peer assessment in large undergraduate courses. There is evidence that, while students find peer assessment challenging, it can be an effective aid to learning. The study also reveals marked differences in attitudes toward peer assessment across different student cohorts.