Experiences of using an evaluation framework

  • Authors:
  • Barbara Kitchenham; Stephen Linkman; Susan Linkman

  • Affiliation:
  • Software Engineering Group, Department of Computer Science, Keele University, Keele Village, Stoke-on-Trent, Staffordshire ST5 5BG, UK

  • Venue:
  • Information and Software Technology
  • Year:
  • 2005

Abstract

This paper reports two trials of an evaluation framework intended for evaluating novel software applications. The framework was originally developed to evaluate a risk-based software bidding model, and its first trial was our evaluation of that bidding model. We found that the framework worked well for validation but needed to be extended before it would be appropriate for evaluation. Subsequently, we compared our framework with a recently completed evaluation of a software tool undertaken as part of the Framework V CLARiFi project. In this case, we did not use the framework to guide the evaluation; instead, we used it to see whether it would identify any weaknesses in the actual evaluation process. Activities recommended by the framework were not undertaken in the suggested order, and problems relating to that oversight surfaced during the tool evaluation activities. Our experiences suggest that the framework offers some benefits but requires further practical testing.