A Replicated Experiment to Assess Requirements Inspection Techniques

  • Authors:
  • Pierfrancesco Fusaro; Filippo Lanubile; Giuseppe Visaggio

  • Affiliations:
  • Fraunhofer Institute for Experimental Software Engineering (IESE); University of Maryland, College Park, USA; University of Bari, Italy

  • Venue:
  • Empirical Software Engineering
  • Year:
  • 1997

Abstract

This paper presents the independent replication of a controlled experiment which compared three defect detection techniques (Ad Hoc, Checklist, and Defect-based Scenario) for software requirements inspections, and evaluated the benefits of collection meetings after individual reviews. The results of our replication were partially different from those of the original experiment. Unlike the original experiment, we did not find any empirical evidence of better performance when using scenarios. To explain these negative findings we provide a list of hypotheses. On the other hand, the replication confirmed one result of the original experiment: the defect detection rate is not improved by collection meetings. The independent replication was made possible by the existence of an experimental kit provided by the original investigators. We discuss the difficulties we encountered in applying the package to our environment, as a result of different cultures and skills. Using our results, experience, and suggestions, other researchers will be able to improve the original experimental design before attempting further replications.