Software testing is an important technique for assuring the quality of software systems, especially high-confidence systems. To automate the testing process, many automatic test-data generation techniques have been proposed. In this paper, we propose a test-data generation technique guided by static defect detection. Our approach first uses static defect detection to identify a set of suspicious statements that are likely to contain faults, and then generates test data to cover these suspicious statements by converting the test-data generation problem into a constraint satisfaction problem. We performed a case study to validate the effectiveness of our approach, including a simple comparison with JUnit Factory, an online test-data generation tool. The results show that, compared with JUnit Factory, our approach generates fewer test data while remaining competitive in fault detection.
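The core idea of converting coverage of a suspicious statement into constraint satisfaction can be illustrated with a minimal sketch. The predicate and the brute-force search below are hypothetical stand-ins (the abstract does not describe the paper's actual analysis or solver): the branch conditions guarding a flagged statement become constraints, and any input satisfying them is a test datum that reaches the statement.

```python
def path_conditions(x, y):
    # Hypothetical branch predicates that must all hold for execution
    # to reach the statement flagged by static defect detection.
    return (x > 10) and (y == x - 1)

def generate_test_data(predicate, domain=range(-50, 51)):
    # Treat test-data generation as constraint satisfaction: search a
    # bounded input domain for values satisfying the path predicate.
    # (A real tool would use a constraint or SMT solver instead.)
    for x in domain:
        for y in domain:
            if predicate(x, y):
                return (x, y)
    return None

print(generate_test_data(path_conditions))  # -> (11, 10)
```

The returned pair is a concrete input that drives execution to the suspicious statement; a generated unit test would then invoke the program under test with it and check for failures.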