An approach to large-scale collection of application usage data over the Internet. Proceedings of the 20th International Conference on Software Engineering.
Residual test coverage monitoring. Proceedings of the 21st International Conference on Software Engineering.
Gamma system: continuous evolution of software after deployment. Proceedings of the 2002 ACM SIGSOFT International Symposium on Software Testing and Analysis (ISSTA '02).
Visualization of program-execution data for deployed software. Proceedings of the 2003 ACM Symposium on Software Visualization.
Improving web application testing with user session data. Proceedings of the 25th International Conference on Software Engineering.
Leveraging field data for impact analysis and regression testing. Proceedings of the 9th European Software Engineering Conference held jointly with the 11th ACM SIGSOFT International Symposium on Foundations of Software Engineering.
Skoll: distributed continuous quality assurance. Proceedings of the 26th International Conference on Software Engineering.
Active learning for automatic classification of software behavior. Proceedings of the 2004 ACM SIGSOFT International Symposium on Software Testing and Analysis (ISSTA '04).
Scalable statistical bug isolation. Proceedings of the 2005 ACM SIGPLAN Conference on Programming Language Design and Implementation.
Profiling deployed software: assessing strategies and testing opportunities. IEEE Transactions on Software Engineering.
Probe distribution techniques to profile events in deployed software. Proceedings of the 17th International Symposium on Software Reliability Engineering (ISSRE '06).
Techniques for classifying executions of deployed software to support software engineering tasks. IEEE Transactions on Software Engineering.
Analyzing deployed software provides a means to characterize and leverage the software's runtime behavior as it is exercised by its intended users. Preliminary studies have shown that leveraging information obtained from the field gives engineers an opportunity to improve their software testing activities. The analysis of deployed software can be performed in three stages: (1) before deployment, determining where instrumentation probes should be inserted into the software and what information they should capture; (2) during deployment, determining when field data should be sent back to the organization; and (3) after deployment, leveraging the collected field information. To make these analysis activities feasible, we must account for the distinct differences between the development environment and the deployed environment: the deployed environment tolerates less overhead, gives engineers less control, and demands highly scalable techniques because of the high volume of information. Hence, existing approaches for in-house analysis may become ineffective, inefficient, or even useless when applied directly to the deployed environment. Existing approaches for analyzing deployed software also need to account for the fact that a technique in one analysis stage may affect the performance of a technique in another stage. This research proposal details the challenges that arise when analyzing deployed software and seeks to develop a set of techniques, applicable within each stage or across stages, that address these challenges.
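The first two stages above can be illustrated with a minimal sketch. The class below is hypothetical (not from any of the works listed): it models an instrumentation probe that samples events to bound runtime overhead (stage 1) and applies a simple buffer-size trigger to decide when field data is "sent back" (stage 2). The names `FieldProbe`, `record`, and `_send` are illustrative assumptions, not an actual API.

```python
import random

class FieldProbe:
    """Hypothetical sketch of a low-overhead instrumentation probe.

    Stage 1: the probe sits at selected program points and records only
    a sampled fraction of events, bounding the overhead imposed on users.
    Stage 2: a trigger decides when buffered field data should be sent
    back; here, when the sampled-event buffer reaches a size threshold.
    """

    def __init__(self, sample_rate=0.1, send_threshold=100, rng=None):
        self.sample_rate = sample_rate    # fraction of events recorded
        self.send_threshold = send_threshold
        self.counts = {}                  # event name -> sampled count
        self.sent_batches = []            # stands in for transmission
        self.rng = rng or random.Random()

    def record(self, event):
        # Sampling keeps per-event cost and data volume low, at the
        # price of an approximate (rather than exact) execution profile.
        if self.rng.random() < self.sample_rate:
            self.counts[event] = self.counts.get(event, 0) + 1
            if sum(self.counts.values()) >= self.send_threshold:
                self._send()

    def _send(self):
        # A real deployment would transmit to the engineers' collection
        # server; here the batch is simply archived locally.
        self.sent_batches.append(dict(self.counts))
        self.counts.clear()

# Simulated field execution: 40 occurrences of one event.
probe = FieldProbe(sample_rate=0.5, send_threshold=5, rng=random.Random(42))
for _ in range(40):
    probe.record("branch_taken")
```

Note how the two design parameters interact across stages: lowering `sample_rate` reduces overhead (a stage-1 concern) but delays the send trigger (a stage-2 concern), echoing the proposal's point that a technique in one stage affects the performance of techniques in another.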