Database models and managerial intuition: 50% model + 50% manager
Management Science
Predicting (Individual) Software Productivity
IEEE Transactions on Software Engineering
Estimeetings: Development Estimates and a Front-End Process for a Large Project
IEEE Transactions on Software Engineering
Timid choices and bold forecasts: a cognitive perspective on risk taking
Management Science
Cost estimation of software intensive projects: a survey of current practices
ICSE '91 Proceedings of the 13th international conference on Software engineering
Software project survival guide
An experimental study of individual subjective effort estimation and combinations of the estimates
Proceedings of the 20th international conference on Software engineering
A Controlled Experiment to Assess the Benefits of Estimating with Analogy and Regression Models
IEEE Transactions on Software Engineering
Software management and cost estimating error
Journal of Systems and Software
Combination of software development effort prediction intervals: why, when and how?
SEKE '02 Proceedings of the 14th international conference on Software engineering and knowledge engineering
A Simulation Tool for Efficient Analogy Based Cost Estimation
Empirical Software Engineering
An empirical study of maintenance and development estimation accuracy
Journal of Systems and Software
Modeling Software Bidding Risks
IEEE Transactions on Software Engineering
On Uncertainty, Ambiguity, and Complexity in Project Management
Management Science
Realism in Assessment of Effort Estimation Uncertainty: It Matters How You Ask
IEEE Transactions on Software Engineering
Group Processes in Software Effort Estimation
Empirical Software Engineering
Regression Models of Software Development Effort Estimation Accuracy and Bias
Empirical Software Engineering
IEEE Transactions on Systems, Man, and Cybernetics, Part A: Systems and Humans
A review of studies on expert estimation of software development effort
Journal of Systems and Software
Journal of Systems and Software
REBSE '07 Proceedings of the Second International Workshop on Realising Evidence-Based Software Engineering
Expert Systems with Applications: An International Journal
Self-efficacy, overconfidence, and the negative effect on subsequent performance: A field study
Information and Management
The high precision man-hours estimation technique in a system proposal phase
International Journal of Computer Applications in Technology
Achieving On-Time Delivery: A Two-Stage Probabilistic Scheduling Strategy for Software Projects
ICSP '09 Proceedings of the International Conference on Software Process: Trustworthy Software Development Processes
A study of the non-linear adjustment for analogy based software cost estimation
Empirical Software Engineering
ESEM '09 Proceedings of the 2009 3rd International Symposium on Empirical Software Engineering and Measurement
Coping with the cone of uncertainty: an empirical study of the SAIV process model
ICSP'07 Proceedings of the 2007 international conference on Software process
Systematic literature reviews in software engineering - A tertiary study
Information and Software Technology
Refining the systematic literature review process--two participant-observer case studies
Empirical Software Engineering
Local bias and its impacts on the performance of parametric estimation models
Proceedings of the 7th International Conference on Predictive Models in Software Engineering
COCOMO-U: an extension of COCOMO II for cost estimation with uncertainty
SPW/ProSim'06 Proceedings of the 2006 international conference on Software Process Simulation and Modeling
ACM Transactions on Software Engineering and Methodology (TOSEM)
Several studies suggest that uncertainty assessments of software development costs are strongly biased toward overconfidence, i.e., software cost estimates are typically believed to be more accurate than they really are. This overconfidence can lead to poor project planning. As a means of improving cost uncertainty assessments, we provide evidence-based guidelines for assessing software development cost uncertainty, derived from the results of relevant empirical studies. The general guidelines are: (1) do not rely solely on unaided, intuition-based uncertainty assessment processes; (2) do not replace expert judgment with formal uncertainty assessment models; (3) apply structured and explicit judgment-based processes; (4) apply strategies based on an outside view of the project; (5) combine uncertainty assessments from different sources through group work, not through mechanical combination; (6) use motivational mechanisms with care and only if greater effort is likely to lead to improved assessments; and (7) frame the assessment problem to fit the structure of the relevant uncertainty information and the assessment process. These guidelines are preliminary and should be updated as new evidence emerges.