Software engineering metrics and models
Database models and managerial intuition: 50% model + 50% manager
Management Science
Method to estimate parameter values in software prediction models
Information and Software Technology
Timid choices and bold forecasts: a cognitive perspective on risk taking
Management Science
Robust regression for developing software estimation models
Journal of Systems and Software
Software project survival guide
An experimental study of individual subjective effort estimation and combinations of the estimates
Proceedings of the 20th international conference on Software engineering
A Controlled Experiment to Assess the Benefits of Estimating with Analogy and Regression Models
IEEE Transactions on Software Engineering
A Simulation Tool for Efficient Analogy Based Cost Estimation
Empirical Software Engineering
Integrating Risk Assessment with Cost Estimation
IEEE Software
METRICS '99 Proceedings of the 6th International Symposium on Software Metrics
Measurement, Prediction and Risk Analysis for Web Applications
METRICS '01 Proceedings of the 7th International Symposium on Software Metrics
Realism in Assessment of Effort Estimation Uncertainty: It Matters How You Ask
IEEE Transactions on Software Engineering
Regression Models of Software Development Effort Estimation Accuracy and Bias
Empirical Software Engineering
An Empirical Study of Software Project Bidding
IEEE Transactions on Software Engineering
Anchoring and adjustment in software estimation
Proceedings of the 10th European software engineering conference held jointly with 13th ACM SIGSOFT international symposium on Foundations of software engineering
A Survey of Controlled Experiments in Software Engineering
IEEE Transactions on Software Engineering
Evidence-Based Guidelines for Assessment of Software Development Cost Uncertainty
IEEE Transactions on Software Engineering
Software project economics: a roadmap
FOSE '07 2007 Future of Software Engineering
Characteristics of software engineers with optimistic predictions
Journal of Systems and Software
Combining probabilistic models for explanatory productivity estimation
Information and Software Technology
Any other cost estimation inhibitors?
Proceedings of the Second ACM-IEEE international symposium on Empirical software engineering and measurement
Using uncertainty as a model selection and comparison criterion
PROMISE '09 Proceedings of the 5th International Conference on Predictor Models in Software Engineering
The effects of request formats on judgment-based effort estimation
Journal of Systems and Software
A checklist for integrating student empirical studies with research and teaching goals
Empirical Software Engineering
Information and Software Technology
Exploring the human and organizational aspects of software cost estimation
Proceedings of the 2010 ICSE Workshop on Cooperative and Human Aspects of Software Engineering
Monetary pricing of software development risks: A method and empirical illustration
Journal of Systems and Software
Local bias and its impacts on the performance of parametric estimation models
Proceedings of the 7th International Conference on Predictive Models in Software Engineering
Information and Software Technology
COCOMO-U: an extension of COCOMO II for cost estimation with uncertainty
SPW/ProSim'06 Proceedings of the 2006 international conference on Software Process Simulation and Modeling
Investigating intentional distortions in software cost estimation - An exploratory study
Journal of Systems and Software
EASE'08 Proceedings of the 12th international conference on Evaluation and Assessment in Software Engineering
ACM Transactions on Software Engineering and Methodology (TOSEM)
The uncertainty of a software development effort estimate can be indicated through a prediction interval (PI), i.e., the estimated minimum and maximum effort corresponding to a specific confidence level. For example, a project manager may be ''90% confident'' or believe that it is ''very likely'' that the effort required to complete a project will be between 8000 and 12,000 work-hours. This paper describes results from four studies (Studies A-D) on human-judgement-based (expert) PIs of software development effort. Study A examines the accuracy of the PIs in real software projects. The results suggest that the PIs were generally much too narrow to reflect the chosen level of confidence, i.e., that there was strong over-confidence. Studies B-D try to explain the observed over-confidence. Study B examines the possibility that the over-confidence is related to the type of experience or estimation process. Study C examines the possibility that the concept of confidence level is difficult for software estimators to interpret. Finally, Study D examines the possibility that there are unfortunate feedback mechanisms that reward over-confidence.