Client sponsored projects in software engineering courses
SIGCSE '03 Proceedings of the 34th SIGCSE technical symposium on Computer science education
Collaborative and experiential learning has many proven merits. Team projects with real clients motivate students to invest the time needed to complete demanding work. However, assessing student performance is not straightforward when individual contributions must be separated from the collective contribution of the team as a whole. Assessment data from multiple sources, including students as assessors of their own and their peers' work, is critical to measuring certain student learning outcomes, such as responsible teamwork and timely communication. In this paper we present our experience with assessing collaborative and experiential learning in five Computer Information Systems courses. The courses were scheduled over three semesters and enrolled 57 students. Student performance and student feedback data were used to evaluate and refine our assessment methodology. We argue that analysis of the assessment data improved our understanding of (1) which assessment measures support more closely targeted learning outcomes and (2) how those measures should be implemented.
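The multi-source assessment the abstract describes can be illustrated with a minimal sketch. The weighting scheme and the function below are assumptions for illustration only, not the paper's actual methodology: an individual grade is formed from a self-assessment, the average of peer assessments, and an instructor assessment.

```python
def individual_score(self_score, peer_scores, instructor_score,
                     w_self=0.2, w_peer=0.3, w_instructor=0.5):
    """Combine self, peer, and instructor assessments into one score.

    The weights are illustrative defaults, not values from the paper.
    All inputs are assumed to be on the same 0-100 scale.
    """
    # Average the peer assessments to dampen individual rater bias.
    peer_avg = sum(peer_scores) / len(peer_scores)
    return (w_self * self_score
            + w_peer * peer_avg
            + w_instructor * instructor_score)

# Example: self 90, peers [80, 85, 75] (average 80), instructor 88.
print(individual_score(90, [80, 85, 75], 88))  # → 86.0
```

In practice, a scheme like this is usually refined with safeguards (e.g. discarding outlier peer ratings or normalizing inflated self-assessments), which is the kind of iteration on assessment measures the paper reports.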