The capability maturity model: guidelines for improving the software process
Introduction to the personal software process
A Discipline for Software Engineering
Software Inspection
Using A Defined and Measured Personal Software Process
IEEE Software
Human-Oriented Improvement in the Software Process
EWSPT '96 Proceedings of the 5th European Workshop on Software Process Technology
Personal Software Process: A User's Perspective
CSEE '96 Proceedings of the 9th Conference on Software Engineering Education
Implementing concepts from the Personal Software Process in an industrial setting
ICSP '96 Proceedings of the Fourth International Conference on the Software Process (ICSP '96)
Empirically Guided Software Effort Guesstimation
IEEE Software
Teaching the PSP: Challenges and Lessons Learned
IEEE Software
Annals of cases on information technology
Measures for mobile users: an architecture
Journal of Systems Architecture: the EUROMICRO Journal - Special issue: Adaptable system/Software architectures
Deploying, updating, and managing tools for collecting software metrics
Proceedings of the 2004 workshop on Quantitative techniques for software agile process
Managing non-invasive measurement tools
Journal of Systems Architecture: the EUROMICRO Journal - Special issue: AGILE methodologies for software production
Ultra-automation and ultra-autonomy for software engineering management of ultra-large-scale systems
ULS '07 Proceedings of the International Workshop on Software Technologies for Ultra-Large-Scale Systems
An analysis of developers' tasks using low-level, automatically collected data
Proceedings of the the 6th joint meeting of the European software engineering conference and the ACM SIGSOFT symposium on The foundations of software engineering
An analysis of developers' tasks using low-level, automatically collected data
The 6th Joint Meeting on European software engineering conference and the ACM SIGSOFT symposium on the foundations of software engineering: companion papers
Teaching disciplined software development
Journal of Systems and Software
Data sets and data quality in software engineering
Proceedings of the 4th international workshop on Predictor models in software engineering
ICSE '09 Proceedings of the 31st International Conference on Software Engineering
We need more coverage, stat! classroom experience with the software ICU
ESEM '09 Proceedings of the 2009 3rd International Symposium on Empirical Software Engineering and Measurement
Evaluation of new software engineering methodologies
XP'03 Proceedings of the 4th international conference on Extreme programming and agile processes in software engineering
ICSP'08 Proceedings of the Software process, 2008 international conference on Making globally distributed software development a success story
Incorporating software agents in automated personal software process (PSP) tools
ISCIT'09 Proceedings of the 9th international conference on Communications and information technologies
Assessing PSP effect in training disciplined software development: A Plan-Track-Review model
Information and Software Technology
Data quality: cinderella at the software metrics ball?
Proceedings of the 2nd International Workshop on Emerging Trends in Software Metrics
Analyzing tool usage to understand to what extent experts change their activities when mentoring
Proceedings of the 2nd International Workshop on Emerging Trends in Software Metrics
The Personal Software Process (PSP) is used by software engineers to gather and analyze data about their work. Published studies typically use data collected using the PSP to draw quantitative conclusions about its impact upon programmer behavior and product quality. However, our experience using PSP led us to question the quality of data both during collection and its later analysis. We hypothesized that data quality problems can make a significant impact upon the value of PSP measures—significant enough to lead to incorrect conclusions regarding process improvement. To test this hypothesis, we built a tool to automate the PSP and then examined 89 projects completed by ten subjects using the PSP manually in an educational setting. We discovered 1539 primary errors and categorized them by type, subtype, severity, and age. To examine the collection problem we looked at the 90 errors that represented impossible combinations of data and at other less concrete anomalies in Time Recording Logs and Defect Recording Logs. To examine the analysis problem we developed a rule set, corrected the errors as far as possible, and compared the original and corrected data. We found significant differences for measures such as yield and the cost-performance ratio, confirming our hypothesis. Our results raise questions about the accuracy of manually collected and analyzed PSP data, indicate that integrated tool support may be required for high quality PSP data analysis, and suggest that external measures should be used when attempting to evaluate the impact of the PSP upon programmer behavior and product quality.