Background: Over the past decades, research interest in constructing prediction models for accurate software cost estimation has increased significantly. Despite the development of sophisticated methodologies, the divergent and often contradictory conclusions reported in the literature remain a subject of ongoing debate. To resolve these inconsistent findings, the research community now attempts to base the entire comparison and evaluation process on formal frameworks and structured guidelines, in line with modern statistical practice.

Aims: Towards this direction, we present StatREC, a graphical user interface that facilitates the visualization and hypothesis testing of error distributions through their graphical representation as Regression Error Characteristic (REC) curves.

Conclusions: The advantage of StatREC is that it offers the non-expert user a robust, highly interactive, graphically rich, and easily interpretable way to compare alternative prediction models. Its goal is to support project managers in decision-making about software development cost.
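To make the central concept concrete: an REC curve plots, for each error tolerance, the fraction of predictions whose absolute error falls within that tolerance, i.e. the empirical CDF of the model's errors. The sketch below (not from StatREC itself; the function name and the example data are illustrative assumptions) shows a minimal computation of an REC curve for two hypothetical cost models evaluated on the same projects:

```python
import numpy as np

def rec_curve(actual, predicted):
    """Compute a Regression Error Characteristic (REC) curve:
    for each error tolerance e, the fraction of predictions whose
    absolute error is <= e (the empirical CDF of absolute errors)."""
    errors = np.sort(np.abs(np.asarray(actual) - np.asarray(predicted)))
    tolerance = np.concatenate(([0.0], errors))          # x-axis: tolerance
    accuracy = np.arange(len(errors) + 1) / len(errors)  # y-axis: accuracy
    return tolerance, accuracy

# Hypothetical actual efforts and two models' predictions (illustrative data)
actual  = np.array([120, 80, 300, 150, 60])
model_a = np.array([110, 95, 280, 160, 70])
model_b = np.array([150, 60, 350, 120, 90])

tol_a, acc_a = rec_curve(actual, model_a)
tol_b, acc_b = rec_curve(actual, model_b)
# The model whose curve rises faster (equivalently, has the larger area
# under it over the tolerance range) is the more accurate one.
```

Plotting the two (tolerance, accuracy) step functions on the same axes reproduces the kind of visual comparison StatREC automates; statistical tests on the underlying error distributions then formalize what the eye sees.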