Automated test case generation for spreadsheets

  • Authors:
  • Marc Fisher, Mingming Cao, Gregg Rothermel, Curtis R. Cook, Margaret M. Burnett

  • Affiliations:
  • Oregon State University, Corvallis, Oregon (all authors)

  • Venue:
  • Proceedings of the 24th International Conference on Software Engineering
  • Year:
  • 2002

Abstract

Spreadsheet languages, which include commercial spreadsheets and various research systems, have had a substantial impact on end-user computing. Research shows, however, that spreadsheets often contain faults. Thus, in previous work, we presented a methodology that assists spreadsheet users in testing their spreadsheet formulas. Our empirical studies have shown that this methodology can help end users test spreadsheets more adequately and efficiently; however, the process of generating test cases can still represent a significant impediment. To address this problem, we have been investigating how to automate test case generation for spreadsheets in ways that support incremental testing and provide immediate visual feedback. We have utilized two techniques for generating test cases, one involving random selection and one involving a goal-oriented approach. We describe these techniques, and report results of an experiment examining their relative costs and benefits.
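The abstract does not detail the generation techniques, but the random-selection idea can be illustrated with a minimal sketch. The code below assumes a toy spreadsheet cell `=IF(A1 > B1, A1 - B1, 0)` and a simple adequacy criterion (exercise both branches of the IF); the formula, the input range, and the helper names are all illustrative choices, not the authors' actual implementation.

```python
import random

def formula(a1, b1):
    # Hypothetical spreadsheet cell: =IF(A1 > B1, A1 - B1, 0)
    return a1 - b1 if a1 > b1 else 0

def random_test_generation(cell_formula, trials=1000, lo=-100, hi=100):
    """Randomly sample input-cell values, keeping only inputs that
    exercise a not-yet-covered branch of the formula's IF."""
    covered = set()   # branches of the IF exercised so far
    tests = []        # (inputs, observed output) pairs retained as test cases
    for _ in range(trials):
        a1, b1 = random.randint(lo, hi), random.randint(lo, hi)
        branch = "then" if a1 > b1 else "else"
        if branch not in covered:          # this input adds coverage
            covered.add(branch)
            tests.append(((a1, b1), cell_formula(a1, b1)))
        if covered == {"then", "else"}:
            break                          # adequacy criterion met
    return tests

tests = random_test_generation(formula)
```

A goal-oriented generator would instead work backwards from an uncovered branch (e.g., solve the constraint `A1 > B1` directly) rather than sampling until coverage is reached by chance; the paper's experiment compares the costs and benefits of the two approaches.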