ICSE '89 Proceedings of the 11th international conference on Software engineering
Extreme programming explained: embrace change
Software assessments, benchmarks, and best practices
Agile software development
Measures for Excellence: Reliable Software on Time, within Budget
Test Driven Development: By Example
Assessing test-driven development at IBM
Proceedings of the 25th International Conference on Software Engineering
Test-Driven Development as a Defect-Reduction Practice
ISSRE '03 Proceedings of the 14th International Symposium on Software Reliability Engineering
Exploring Extreme Programming in Context: An Industrial Case Study
ADC '04 Proceedings of the Agile Development Conference
Software engineering practice versus evidence-based software engineering research
REBSE '05 Proceedings of the 2005 workshop on Realising evidence-based software engineering
Many factors influence the quality data obtained from industrial case studies, making comparisons between studies difficult. This paper shares experiences from two longitudinal industrial case studies that illustrate the complications that can arise. The first is a case study of an IBM team that transitioned to test-driven development; the primary quality measure was functional verification test defects normalized by lines of code. The second case study was performed with an Extreme Programming team at Sabre Airline Solutions, where both test defects and field defects were compared. In both case studies, differences between the compared teams and projects made the comparisons indicative rather than absolute.
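The normalization the abstract refers to is the standard defect-density metric: defect count divided by code size, conventionally reported per thousand lines of code (KLOC). A minimal sketch of that computation is below; the function name and all numeric values are illustrative assumptions, not data from the studies.

```python
# Hypothetical sketch of defect-density normalization (defects per KLOC).
# All names and numbers here are illustrative, not data from the IBM or
# Sabre case studies described in the abstract.

def defect_density(defects: int, lines_of_code: int) -> float:
    """Return defects per thousand lines of code (KLOC)."""
    if lines_of_code <= 0:
        raise ValueError("lines_of_code must be positive")
    return defects / (lines_of_code / 1000)

# Comparing two hypothetical releases of different sizes:
baseline = defect_density(defects=84, lines_of_code=40_000)  # 2.1 defects/KLOC
tdd_team = defect_density(defects=45, lines_of_code=50_000)  # 0.9 defects/KLOC
print(f"baseline: {baseline:.1f}, TDD team: {tdd_team:.1f} defects/KLOC")
```

Normalizing by size is what allows teams or releases of different sizes to be compared at all, though, as the abstract notes, contextual differences can still keep such comparisons indicative rather than absolute.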