BERT: BEhavioral Regression Testing

  • Authors:
  • Alessandro Orso; Tao Xie

  • Affiliations:
  • Georgia Institute of Technology; North Carolina State University

  • Venue:
  • WODA '08: Proceedings of the 2008 International Workshop on Dynamic Analysis, held in conjunction with the ACM SIGSOFT International Symposium on Software Testing and Analysis (ISSTA 2008)
  • Year:
  • 2008

Abstract

During maintenance, it is common to run the new version of a program against its existing test suite to check whether the modifications in the program introduced unforeseen side effects. Although this kind of regression testing can be effective in identifying some change-related faults, it is limited by the quality of the existing test suite. Because generating tests for real programs is expensive, developers build test suites by finding acceptable tradeoffs between cost and thoroughness of the tests. Such test suites necessarily target only a small subset of the program's functionality and may miss many regression faults. To address this issue, we introduce the concept of behavioral regression testing, whose goal is to identify behavioral differences between two versions of a program through dynamic analysis. Intuitively, given a set of changes in the code, behavioral regression testing works by (1) generating a large number of test cases that focus on the changed parts of the code, (2) running the generated test cases on the old and new versions of the code and identifying differences in the tests' outcomes, and (3) analyzing the identified differences and presenting them to the developers. By focusing on a subset of the code and leveraging differential behavior, our approach can provide developers with more (and more focused) information than traditional regression testing techniques. This paper presents our approach and a preliminary assessment of its feasibility.
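
As a rough illustration of step (2), the behavioral comparison, the following Java sketch runs the same generated inputs against an old and a new implementation of a changed method and records any difference in observable outcome (return value or thrown exception). The method pair, the random input generation, and the class name `BehavioralDiffSketch` are illustrative assumptions, not part of the paper's tooling.

```java
import java.util.ArrayList;
import java.util.List;
import java.util.Random;
import java.util.function.Supplier;

// Minimal sketch of the behavioral comparison step: run the same
// generated inputs against the old and new versions of a changed
// method and record any difference in observable outcome.
public class BehavioralDiffSketch {

    // Hypothetical old and new implementations of a changed method.
    static int oldVersion(int x) { return x * 2; }
    static int newVersion(int x) { return x < 0 ? 0 : x * 2; } // behavior changed for negative inputs

    public static void main(String[] args) {
        Random random = new Random(42);
        List<String> differences = new ArrayList<>();

        // (1) Generate many inputs that exercise the changed method.
        for (int i = 0; i < 1000; i++) {
            int input = random.nextInt(201) - 100; // inputs in [-100, 100]

            // (2) Run both versions and capture their outcomes,
            // including exceptions, so they can be compared.
            Object oldOutcome = runSafely(() -> oldVersion(input));
            Object newOutcome = runSafely(() -> newVersion(input));

            if (!oldOutcome.equals(newOutcome)) {
                differences.add("input=" + input
                        + " old=" + oldOutcome + " new=" + newOutcome);
            }
        }

        // (3) Report the identified behavioral differences to the developer.
        System.out.println(differences.size() + " behavioral differences found");
        differences.stream().limit(5).forEach(System.out::println);
    }

    // Capture either the return value or the thrown exception as the outcome.
    static Object runSafely(Supplier<Object> call) {
        try {
            return call.get();
        } catch (RuntimeException e) {
            return "exception:" + e.getClass().getSimpleName();
        }
    }
}
```

In the approach described above, the test inputs would come from a test-generation tool focused on the changed code rather than a plain random generator, and the comparison would cover richer observable behavior than a single return value; this sketch only shows the core run-both-versions-and-diff loop.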