Measurement definition, collection, and analysis is an essential component of high-quality software engineering practice, and thus an essential component of the software engineering curriculum. However, giving students practical experience with measurement in a classroom setting can be so time-consuming and intrusive that it becomes counter-productive, teaching students that software measurement is "impractical" for many software development contexts. In this research, we designed and evaluated a very low-overhead approach to measurement collection and analysis using the Hackystat system with special features for classroom use. We deployed this system in two software engineering classes at the University of Hawaii during Fall 2003, and collected quantitative and qualitative data to evaluate the effectiveness of the approach. Results indicate that the approach represents substantial progress toward practical, automated metrics collection and analysis, though issues relating to the complexity of installation and the privacy of user data remain.