A systematic review of effect size in software engineering experiments

  • Authors:
  • Vigdis By Kampenes, Tore Dybå, Jo E. Hannay, Dag I. K. Sjøberg

  • Affiliations:
  • Department of Software Engineering, Simula Research Laboratory, P.O. Box 134, NO-1325 Lysaker, Norway and Department of Informatics, University of Oslo, P.O. Box 1080 Blindern, NO-0316 Oslo, Norway (Kampenes, Hannay, Sjøberg)
  • Department of Software Engineering, Simula Research Laboratory, P.O. Box 134, NO-1325 Lysaker, Norway and SINTEF ICT, NO-7465 Trondheim, Norway (Dybå)

  • Venue:
  • Information and Software Technology
  • Year:
  • 2007


Abstract

An effect size quantifies the effects of an experimental treatment. Conclusions drawn from hypothesis testing results might be erroneous if effect sizes are not judged in addition to statistical significance. This paper reports a systematic review of 92 controlled experiments published in 12 major software engineering journals and conference proceedings in the decade 1993-2002. The review investigates the practice of effect size reporting, summarizes standardized effect sizes detected in the experiments, discusses the results and gives advice for improvements. Standardized and/or unstandardized effect sizes were reported in 29% of the experiments. Interpretations of the effect sizes in terms of practical importance were not discussed beyond references to standard conventions. The standardized effect sizes computed from the reviewed experiments were equal to observations in psychology studies and slightly larger than standard conventions in behavioral science.
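The standardized effect size the review most often refers to is the standardized mean difference (Cohen's d): the difference between group means divided by the pooled standard deviation. A minimal sketch of that computation, using hypothetical data and a helper name (`cohens_d`) introduced here for illustration:

```python
import statistics

def cohens_d(group_a, group_b):
    """Standardized mean difference (Cohen's d) using the pooled standard deviation."""
    n_a, n_b = len(group_a), len(group_b)
    mean_a, mean_b = statistics.mean(group_a), statistics.mean(group_b)
    var_a, var_b = statistics.variance(group_a), statistics.variance(group_b)
    # Pooled variance weights each group's sample variance by its degrees of freedom.
    pooled_var = ((n_a - 1) * var_a + (n_b - 1) * var_b) / (n_a + n_b - 2)
    return (mean_a - mean_b) / pooled_var ** 0.5

# Hypothetical task-completion times (minutes) for treatment and control groups.
treatment = [12.0, 14.0, 11.0, 13.0, 15.0]
control = [16.0, 18.0, 15.0, 17.0, 19.0]
d = cohens_d(treatment, control)  # negative d: the treatment group was faster
```

Unlike a raw (unstandardized) mean difference, d is unit-free, which is what allows the review to compare magnitudes across experiments and against the conventional small/medium/large thresholds from behavioral science.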