Students' understanding of algorithms and data structures can be assessed with construction tasks, in which students build their own visualizations. It has been claimed that actively constructing visualizations leads to better learning outcomes than passively viewing a visualized algorithm or merely changing its input. This paper presents a system for the generation, execution, and evaluation of construction tasks. Its key feature is flexibility in all three stages: generation and evaluation range from fully automated to fully manual, and several types of automatic feedback are available during the execution phase. Besides its use in daily teaching, the system can serve as a test bed for evaluating the effectiveness of visualizations in the learning process.
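To make the idea of automatic evaluation concrete, the sketch below shows one plausible way such a grader could work: the student "constructs" a visualization as a sequence of array states, and the evaluator compares it step by step against a reference trace generated from the algorithm itself. All names and the choice of selection sort are illustrative assumptions, not the actual implementation described in the paper.

```python
# Hypothetical sketch of automatic evaluation for a construction task
# (illustrative only; not the system's real API).

def reference_trace(data):
    """Generate the reference trace for selection sort: the array
    state before the first and after each outer-loop iteration."""
    a = list(data)
    trace = [list(a)]
    for i in range(len(a) - 1):
        m = min(range(i, len(a)), key=a.__getitem__)  # index of smallest remaining element
        a[i], a[m] = a[m], a[i]
        trace.append(list(a))
    return trace

def evaluate(student_trace, data):
    """Compare a student-built trace with the reference trace and
    return (passed, feedback), reporting the first diverging step."""
    expected = reference_trace(data)
    for step, (got, want) in enumerate(zip(student_trace, expected)):
        if got != want:
            return False, f"step {step}: expected {want}, got {got}"
    if len(student_trace) != len(expected):
        return False, f"expected {len(expected)} steps, got {len(student_trace)}"
    return True, "all steps correct"
```

A fully automated setup would generate the input data and the reference trace itself; a manual one would let the instructor supply either, which is the kind of flexibility the abstract describes.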