Testing is a very important part of software development. Nearly 80% of software failures can be attributed to improper or inefficient testing. Testing is performed using different types of strategies. It is usually performed on code, but if software can be tested in earlier phases, most errors can be eliminated before they propagate to subsequent phases. There is therefore a need to explore testing possibilities in the earlier phases. This paper presents a novel requirement-based testing approach that can catch errors in the initial phase. Formal specification languages play a vital role in software testing: formal models provide a precise specification of the system and can be used as a vehicle for driving the development process. To perform requirement-based testing, we need a formal language that can handle requirement specifications efficiently. Many researchers have proposed approaches for generating test cases from formal specifications, including test case generation from state-based languages such as Z, VDM, and B. In this paper we propose a technique that provides better coverage of requirements than other approaches. To maximize requirement coverage in our model, we annotate our specifications with requirement identifiers, which help in later stages to detect which requirements are covered and which are yet to be tested. Test cases are generated by extracting invariants and postconditions from our specification and transforming them into a generalized form. Using test selection criteria, we can cover all parts of our model and generate test cases for each test objective.
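The workflow the abstract describes — annotating specification operations with requirement identifiers, extracting their pre/postconditions, and tracking which requirements the generated tests cover — can be sketched as follows. This is a minimal illustration, not the paper's actual tool: the `Operation` class, the `REQ-*` identifiers, and the `generate_tests` helper are all hypothetical stand-ins for a B/Z operation schema and its predicates.

```python
from dataclasses import dataclass

# Hypothetical toy model of an annotated operation schema: each operation
# carries the requirement identifiers it realises, plus Python predicates
# standing in for its precondition (invariant) and postcondition.

@dataclass
class Operation:
    name: str
    requirements: set       # requirement identifiers, e.g. {"REQ-1"}
    precondition: object    # predicate over an input value
    postcondition: object   # predicate over (input, output)

def generate_tests(operations, candidate_inputs):
    """For each operation, pick an input satisfying its precondition,
    and record which requirement identifiers the tests cover."""
    tests, covered = [], set()
    for op in operations:
        for x in candidate_inputs:
            if op.precondition(x):
                tests.append((op.name, x))
                covered |= op.requirements
                break  # one test per operation in this sketch
    return tests, covered

# Example: a "withdraw" operation annotated with a requirement identifier.
ops = [Operation("withdraw", {"REQ-1"},
                 precondition=lambda x: x > 0,
                 postcondition=lambda x, y: y >= 0)]
tests, covered = generate_tests(ops, [0, -5, 10])
uncovered = {"REQ-1", "REQ-2"} - covered  # requirements still to be tested
```

Here the annotation makes coverage measurable: after generation, `uncovered` holds exactly the requirement identifiers for which no test objective has yet been exercised, which is the detection step the abstract refers to.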