Improving student performance by evaluating how well students test their own programs
Many systems collect snapshots of student work, typically after each compile, save, or submission. Visualizing these work histories can yield valuable insights into patterns of student work, common errors, and similar phenomena. Currently, each system must provide its own visualization facilities or rely on external tools such as R or Excel. However, the actual values stored in snapshot-based data sets are similar across systems and, given agreed-upon conventions, could be visualized with a common system. We have built a prototype web service called Snapviz, which displays snapshot data from tab-delimited data files uploaded by users.
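To illustrate the kind of agreed-upon convention the abstract describes, the sketch below parses a tab-delimited snapshot file into per-student work histories. The column names (`student`, `timestamp`, `event`, `errors`) are illustrative assumptions, not Snapviz's actual schema, and the parser is a minimal example rather than the service's implementation.

```python
import csv
import io
from collections import defaultdict

# Hypothetical snapshot data: one row per compile/save/submission event.
# Column names are assumed for illustration only.
SAMPLE = """student\ttimestamp\tevent\terrors
alice\t1000\tcompile\t2
alice\t1060\tcompile\t0
bob\t1010\tsave\t1
"""

def load_snapshots(text):
    """Parse tab-delimited snapshot rows into per-student histories,
    each a time-ordered list of (timestamp, event, error_count)."""
    histories = defaultdict(list)
    for row in csv.DictReader(io.StringIO(text), delimiter="\t"):
        histories[row["student"]].append(
            (int(row["timestamp"]), row["event"], int(row["errors"]))
        )
    for events in histories.values():
        events.sort()  # order each history by timestamp
    return histories

histories = load_snapshots(SAMPLE)
print(len(histories["alice"]))  # number of snapshots recorded for alice
```

Once snapshot data from different collection systems is normalized into a shared tabular form like this, a single visualization front end can chart any of them, which is the premise behind Snapviz.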