The focus of assessing learning experiences has shifted from knowledge to competences. Unfortunately, assessing certain competences is a largely subjective task, which is problematic for both evaluators and the evaluated. Moreover, when the learning process is computer-supported and the number of students grows, traditional assessment procedures suffer from scalability problems. In this paper we introduce a system that supports grading learning competences according to students' performance in an online course. The system automatically extracts objective indicators of students' work in a Learning Management System (LMS): evaluators express the required indicators in an assessment-specific query language, and the indicators are then computed automatically from the activity logs the LMS generates. A case study on Moodle-based courses illustrates how such indicators are obtained and how the assessment results can be analyzed.
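To make the idea of an "objective indicator extracted from activity logs" concrete, here is a minimal sketch in Python. It assumes a deliberately simplified Moodle-style log (the column names `user`, `component`, and `action` and the sample events are illustrative assumptions, not the real Moodle export format, and the function below is not the paper's query language, only an example of the kind of count-based indicator such a language could express):

```python
import csv
import io
from collections import Counter

# Hypothetical, simplified Moodle-style activity log.
# Real Moodle log exports have more columns; these are assumptions.
LOG_CSV = """user,component,action
alice,mod_forum,post_created
bob,mod_quiz,attempt_submitted
alice,mod_forum,post_created
alice,mod_quiz,attempt_submitted
bob,mod_forum,post_created
"""

def indicator(log_text: str, component: str, action: str) -> dict:
    """Count matching log events per student: one objective indicator."""
    counts = Counter()
    for row in csv.DictReader(io.StringIO(log_text)):
        if row["component"] == component and row["action"] == action:
            counts[row["user"]] += 1
    return dict(counts)

# e.g. "forum posts created" as a participation indicator
print(indicator(LOG_CSV, "mod_forum", "post_created"))
# → {'alice': 2, 'bob': 1}
```

In the system described above, an evaluator would state such an indicator declaratively in the assessment-specific query language rather than code it by hand; the sketch only shows the underlying log-counting step.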