Experimental assessment of software metrics using automated refactoring

  • Authors:
  • Mel Ó Cinnéide (University College Dublin, Dublin, Ireland)
  • Laurence Tratt (King's College London, London, United Kingdom)
  • Mark Harman (University College London, London, United Kingdom)
  • Steve Counsell (Brunel University, London, United Kingdom)
  • Iman Hemati Moghadam (University College Dublin, Dublin, Ireland)

  • Venue:
  • Proceedings of the ACM-IEEE International Symposium on Empirical Software Engineering and Measurement
  • Year:
  • 2012

Abstract

A large number of software metrics have been proposed in the literature, but there is little understanding of how these metrics relate to one another. We propose a novel experimental technique, based on search-based refactoring, to assess software metrics and to explore relationships between them. Our goal is not to improve the program being refactored, but to assess the software metrics that guide the automated refactoring through repeated refactoring experiments. We apply our approach to five popular cohesion metrics using eight real-world Java systems, involving 300,000 lines of code and over 3,000 refactorings. Our results demonstrate that cohesion metrics disagree with each other in 55% of cases, and show how our approach can be used to reveal novel and surprising insights into the software metrics under investigation.
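To make the experimental setup concrete, below is a minimal Java sketch of the kind of metric-guided search the abstract describes: one cohesion metric drives a first-ascent hill climb over the legal refactorings of a program, and each accepted step is classified as agreement or disagreement under every other metric. All types and names here (Program, Refactoring, CohesionMetric, MetricGuidedExperiment) are hypothetical placeholders invented for illustration, not the authors' actual tool.

import java.util.ArrayList;
import java.util.Collections;
import java.util.List;
import java.util.Random;

// Hypothetical abstractions: a Program snapshot, a Refactoring that
// produces a new snapshot, and a CohesionMetric scoring a program
// (higher score = more cohesive).
interface Program { List<Refactoring> legalRefactorings(); }
interface Refactoring { Program applyTo(Program p); }
interface CohesionMetric { String name(); double score(Program p); }

final class MetricGuidedExperiment {
    private final Random rng = new Random(42);

    void run(Program start, CohesionMetric guide,
             List<CohesionMetric> others, int maxSteps) {
        Program current = start;
        int[] agree = new int[others.size()];
        int[] disagree = new int[others.size()];
        int accepted = 0;

        for (int step = 0; step < maxSteps; step++) {
            double base = guide.score(current);
            List<Refactoring> candidates =
                new ArrayList<>(current.legalRefactorings());
            Collections.shuffle(candidates, rng);

            // First-ascent hill climbing: accept the first refactoring
            // that strictly improves the guiding metric.
            Program next = null;
            for (Refactoring r : candidates) {
                Program cand = r.applyTo(current);
                if (guide.score(cand) > base) { next = cand; break; }
            }
            if (next == null) break; // local optimum for the guiding metric

            // Classify the accepted step under every other metric:
            // improved = agreement, worsened = disagreement,
            // unchanged = the metric is indifferent to this refactoring.
            for (int i = 0; i < others.size(); i++) {
                double before = others.get(i).score(current);
                double after  = others.get(i).score(next);
                if (after > before) agree[i]++;
                else if (after < before) disagree[i]++;
            }
            current = next;
            accepted++;
        }

        for (int i = 0; i < others.size(); i++) {
            System.out.printf("%s vs %s: %d agree, %d disagree of %d steps%n",
                guide.name(), others.get(i).name(),
                agree[i], disagree[i], accepted);
        }
    }
}

Dividing disagree[i] by the number of accepted steps yields a per-metric-pair disagreement rate; aggregated over guiding metrics and subject systems, this is the kind of figure behind the 55% disagreement reported above.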