Automating regression testing for evolving GUI software
Journal of Software Maintenance and Evolution: Research and Practice - 2003 International Conference on Software Maintenance: The Architectural Evolution of Systems
TimeAware test suite prioritization
Proceedings of the 2006 international symposium on Software testing and analysis
Performance Measurement of Novice HPC Programmers Code
SE-HPC '07 Proceedings of the 3rd International Workshop on Software Engineering for High Performance Computing Applications
Efficient time-aware prioritization with knapsack solvers
Proceedings of the 1st ACM international workshop on Empirical assessment of software engineering languages and technologies: held in conjunction with the 22nd IEEE/ACM International Conference on Automated Software Engineering (ASE) 2007
Test generation for graphical user interfaces based on symbolic execution
Proceedings of the 3rd international workshop on Automation of software test
Covering code behavior on input validation in functional testing
Information and Software Technology
Event Listener Analysis and Symbolic Execution for Testing GUI Applications
ICFEM '09 Proceedings of the 11th International Conference on Formal Engineering Methods: Formal Methods and Software Engineering
Harnessing web-based application similarities to aid in regression testing
ISSRE'09 Proceedings of the 20th IEEE international conference on software reliability engineering
A proposal for automatic testing of GUIs based on annotated use cases
Advances in Software Engineering - Special issue on software test automation
Search-based system testing: high coverage, no false alarms
Proceedings of the 2012 International Symposium on Software Testing and Analysis
EXSYST: search-based GUI testing
Proceedings of the 34th International Conference on Software Engineering
"Nightly/daily building and smoke testing" have become widespread since they often reveal bugs early in the software development process. During these builds, software is compiled, linked, and (re)tested with the goal of validating its basic functionality. Although successful for conventional software, smoke tests are difficult to develop and automatically rerun for software that has a graphical user interface (GUI). In this paper, we describe a framework called DART (Daily Automated Regression Tester) that addresses the needs of frequent and automated re-testing of GUI software. The key to our success is automation: DART automates everything from structural GUI analysis, test case generation, and test oracle creation to code instrumentation, test execution, coverage evaluation, regeneration of test cases, and their re-execution. Together with the operating system's task scheduler, DART can execute frequently with little input from the developer/tester to retest the GUI software. We provide results of experiments showing the time taken and memory required for GUI analysis, test case and test oracle generation, and test execution. We also empirically compare the relative costs of employing different levels of detail in the GUI test cases.
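The pipeline the abstract describes (GUI analysis, test-case generation, oracle creation, execution, re-execution) can be sketched in outline. This is a minimal illustrative sketch, not DART's actual implementation: every name here (`analyze_gui`, `generate_test_cases`, `create_oracle`, the toy application) is a hypothetical stand-in, and the generation and oracle strategies are deliberately simplistic.

```python
# Hypothetical sketch of a DART-style nightly regression loop.
# All function and type names are illustrative stand-ins, not DART's API.
from dataclasses import dataclass, field


@dataclass
class GuiModel:
    events: list            # events the GUI exposes (from structural analysis)


@dataclass
class TestCase:
    events: list            # sequence of GUI events to replay
    expected: dict = field(default_factory=dict)  # oracle: expected GUI state


def analyze_gui(app_events):
    """Structural GUI analysis: enumerate the events the GUI exposes."""
    return GuiModel(events=list(app_events))


def generate_test_cases(model, length):
    """Generate fixed-length event sequences (here: simple sliding windows)."""
    evs = model.events
    return [TestCase(events=evs[i:i + length])
            for i in range(len(evs) - length + 1)]


def create_oracle(tc, run):
    """Record the current (assumed-correct) GUI state as the expected state."""
    tc.expected = run(tc.events)


def execute(tc, run):
    """Re-run a test case and compare the resulting state against its oracle."""
    return run(tc.events) == tc.expected


def nightly_build(run, app_events, length=2):
    """One automated cycle: analyze, generate, create oracles, return suite."""
    model = analyze_gui(app_events)
    suite = generate_test_cases(model, length)
    for tc in suite:
        create_oracle(tc, run)
    return suite


# Toy "application": its observable state is just the last event handled.
def app(events):
    return {"last": events[-1] if events else None}


suite = nightly_build(app, ["open", "edit", "save", "close"])
results = [execute(tc, app) for tc in suite]
print(all(results))  # unchanged application -> every test still passes
```

A scheduler (cron, Task Scheduler) would invoke `nightly_build` on each build; regenerating the suite and oracles after intentional GUI changes keeps the smoke tests in step with the evolving interface.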