Using tools for verification, documentation and testing

  • Authors: Leon J. Osterweil
  • Affiliations: -
  • Venue: Proceedings of the SIGNUM Conference on the Programming Environment for Development of Numerical Software
  • Year: 1978

Abstract

There has been considerable interest lately in methodologies for the production of high quality computer software. Work in this area has been carried out by researchers in a wide variety of disciplines and covers an impressive spectrum of approaches. Some of the more active current lines of research include: software management techniques [1, 2]; creation of error resistant programming techniques [3, 4, 5]; and design of error resistant programming languages [6, 7]. There has also been considerable activity in the creation of program testing, verification and documentation tools. The work in this area has been directed primarily towards two different but related goals—the detection and examination of errors present in a program, and the determination that a given program has no errors of some particular type. Among the diverse activities in this area, this paper shall focus on four of the major approaches—namely dynamic testing, symbolic execution, formal verification and static analysis. In this paper, the different patterns of strengths, weaknesses and applications of these approaches will be shown. It will, moreover, be demonstrated that these patterns are in many ways complementary, offering the hope that they can be coordinated and unified into a single comprehensive program testing and verification system capable of performing a diverse and useful variety of error detection, verification and documentation functions.
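For readers unfamiliar with the terms, the short Python sketch below (not from the paper; the function names and checks are hypothetical) contrasts two of the four approaches the abstract names. A toy static analysis inspects the source of average without running it and can only flag a potential divide-by-zero site, while a dynamic test executes the function on a chosen input and exposes the empty-list failure concretely; this is the kind of complementary pattern of strengths and weaknesses the abstract refers to.

# Illustrative sketch only -- not from Osterweil's paper; the function
# names and the particular checks are hypothetical.
import ast
import inspect


def average(values):
    # Faulty for an empty list: sum([]) / len([]) raises ZeroDivisionError.
    return sum(values) / len(values)


def flags_division(func):
    # Toy "static analysis": parse the source text without executing it and
    # report whether any division operator occurs, i.e. a potential
    # divide-by-zero site that merits closer review.
    tree = ast.parse(inspect.getsource(func))
    return any(isinstance(node, ast.BinOp) and isinstance(node.op, ast.Div)
               for node in ast.walk(tree))


if __name__ == "__main__":
    # Static analysis reasons over all inputs but only warns about a class
    # of possible errors; it cannot show that a failure actually occurs.
    print("potential divide-by-zero site:", flags_division(average))

    # Dynamic testing runs the code on selected inputs; it demonstrates a
    # failure concretely, but only for the inputs actually tried.
    print("average([2, 4]) =", average([2, 4]))
    try:
        average([])
    except ZeroDivisionError:
        print("dynamic test exposed the empty-list failure")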