Benchmarking is one of the most important methods for learning best practices in software process improvement. However, in the current software process context, benchmarking is applied mainly to projects rather than to software development tasks. Can software development tasks be benchmarked? If so, how? Moreover, benchmarking software development tasks has to deal with multivariate data and variable returns to scale (VRS). This paper reports practical experience of benchmarking software development tasks under multivariate and VRS constraints using Data Envelopment Analysis (DEA). The analysis of experience data from the Institute of Software, Chinese Academy of Sciences (ISCAS) indicates that the ideas and techniques used to benchmark software projects can be deployed at the software development task level. The results also show that the DEA VRS model gives developers new insight into how to identify the relatively efficient tasks as task performance benchmarks and how to establish a distinct reference set for each relatively inefficient task under multivariate and VRS constraints. We thus recommend the DEA VRS model as the default technique for benchmarking software development tasks. Our results are beneficial to software process improvement. To the best of our knowledge, this is the first report of such comprehensive and repeatable results on benchmarking software development tasks using DEA.
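To make the DEA VRS approach concrete, the sketch below solves the standard input-oriented BCC (VRS) envelopment model with a linear-programming solver. It is not the authors' exact formulation or data; the function name `dea_vrs_input_oriented` and the five-task example (inputs: effort hours and defects injected; outputs: LOC delivered and test cases passed) are hypothetical illustrations. Each task's efficiency score and reference set of efficient peers fall out of the optimal solution, which mirrors how relatively inefficient tasks are matched to benchmarks.

```python
# Minimal sketch of an input-oriented DEA VRS (BCC) model, solved with scipy.
# The task data below are hypothetical; a real study would use measured
# inputs (e.g., effort, size) and outputs (e.g., delivered functionality, quality).
import numpy as np
from scipy.optimize import linprog

def dea_vrs_input_oriented(X, Y):
    """Return efficiency scores and peer weights for each DMU (task).

    X: (m_inputs, n_tasks) input matrix; Y: (s_outputs, n_tasks) output matrix.
    """
    m, n = X.shape
    s, _ = Y.shape
    scores, lambdas = np.zeros(n), np.zeros((n, n))
    for o in range(n):
        # Decision variables: [theta, lambda_1, ..., lambda_n]; minimize theta.
        c = np.concatenate(([1.0], np.zeros(n)))
        # Input constraints: sum_j lambda_j * x_ij - theta * x_io <= 0
        A_in = np.hstack((-X[:, [o]], X))
        b_in = np.zeros(m)
        # Output constraints: -sum_j lambda_j * y_rj <= -y_ro
        A_out = np.hstack((np.zeros((s, 1)), -Y))
        b_out = -Y[:, o]
        A_ub = np.vstack((A_in, A_out))
        b_ub = np.concatenate((b_in, b_out))
        # VRS convexity constraint: sum_j lambda_j = 1
        A_eq = np.concatenate(([0.0], np.ones(n))).reshape(1, -1)
        b_eq = np.array([1.0])
        bounds = [(None, None)] + [(0.0, None)] * n  # theta free, lambda_j >= 0
        res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=b_eq,
                      bounds=bounds, method="highs")
        scores[o], lambdas[o] = res.x[0], res.x[1:]
    return scores, lambdas

# Hypothetical example: 5 tasks, 2 inputs, 2 outputs.
X = np.array([[20, 35, 15, 40, 25],      # effort hours
              [ 3,  6,  2,  8,  4]], dtype=float)   # defects injected
Y = np.array([[800, 900, 700, 950, 850], # LOC delivered
              [ 40,  42,  38,  45,  41]], dtype=float)  # test cases passed
scores, lambdas = dea_vrs_input_oriented(X, Y)
for o, (score, lam) in enumerate(zip(scores, lambdas)):
    peers = [j for j, w in enumerate(lam) if w > 1e-6 and j != o]
    print(f"task {o}: efficiency={score:.3f}, reference set={peers}")
```

Tasks with an efficiency score of 1.0 form the benchmark frontier; for each inefficient task, the peers with nonzero lambda weights constitute its reference set, i.e., the efficient tasks it should learn from.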