Proceedings of the 34th International Conference on Software Engineering
Software maintenance activities require a sufficient understanding of the software at hand, which is unfortunately not always readily available. Execution trace visualization is a common approach to gaining this understanding, and among our own efforts in this context is Extravis, a tool for the visualization of large traces. While many such tools have been evaluated through case studies, to date there have been no quantitative evaluations. This paper reports on the first controlled experiment to quantitatively measure the added value of trace visualization for program comprehension. We designed eight typical tasks aimed at gaining an understanding of a representative subject system, and measured how a control group (using the Eclipse IDE) and an experimental group (using both Eclipse and Extravis) performed these tasks in terms of time spent and solution correctness. The results are statistically significant in both regards, showing a 22 percent decrease in time spent and a 43 percent increase in correctness for the group using trace visualization.
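The kind of between-groups comparison the abstract describes, testing whether the difference in task completion time between a control and an experimental group is statistically significant, can be sketched with a simple two-sample permutation test. The numbers and group sizes below are invented for illustration only; they are not the experiment's data, and the actual study may have used a different test.

```python
# Hypothetical sketch: permutation test on task completion times for a
# control group (Eclipse only) vs. an experimental group (Eclipse +
# Extravis). All data values are made up for illustration.
import random
import statistics

control = [58, 61, 55, 64, 60, 57, 63, 59]        # minutes, hypothetical
experimental = [45, 48, 44, 50, 46, 43, 49, 47]   # minutes, hypothetical

# Observed effect: how much slower the control group was on average.
observed = statistics.mean(control) - statistics.mean(experimental)

def permutation_p_value(a, b, observed_diff, trials=10_000, seed=0):
    """One-sided two-sample permutation test on the difference of means.

    Repeatedly shuffles the pooled observations into two pseudo-groups of
    the original sizes and counts how often the shuffled mean difference
    is at least as large as the observed one.
    """
    rng = random.Random(seed)
    pooled = a + b
    count = 0
    for _ in range(trials):
        rng.shuffle(pooled)
        diff = (statistics.mean(pooled[:len(a)])
                - statistics.mean(pooled[len(a):]))
        if diff >= observed_diff:
            count += 1
    return count / trials

p = permutation_p_value(control, experimental, observed)
print(f"mean difference: {observed:.1f} min, p = {p:.4f}")
```

With clearly separated groups like these, the p-value comes out far below 0.05, mirroring the significance claim in the abstract; a nonparametric test such as this avoids assuming normally distributed completion times.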