A refactoring tool for Smalltalk
Theory and Practice of Object Systems - Special Issue on Object-Oriented Software Evolution and Re-engineering
Extended static checking for Java
PLDI '02 Proceedings of the ACM SIGPLAN 2002 Conference on Programming Language Design and Implementation
OOPSLA '04 Companion to the 19th Annual ACM SIGPLAN Conference on Object-Oriented Programming Systems, Languages, and Applications
Correlation exploitation in error ranking
Proceedings of the 12th ACM SIGSOFT International Symposium on Foundations of Software Engineering
IEEE Security and Privacy
Check 'n' crash: combining static checking and testing
Proceedings of the 27th International Conference on Software Engineering
Automatic Mining of Source Code Repositories to Improve Bug Finding Techniques
IEEE Transactions on Software Engineering
Integrating Static and Dynamic Analysis for Detecting Vulnerabilities
COMPSAC '06 Proceedings of the 30th Annual International Computer Software and Applications Conference - Volume 01
Prioritizing Software Inspection Results using Static Profiling
SCAM '06 Proceedings of the Sixth IEEE International Workshop on Source Code Analysis and Manipulation
Data Mining: Practical Machine Learning Tools and Techniques, Second Edition (Morgan Kaufmann Series in Data Management Systems)
Data Mining Static Code Attributes to Learn Defect Predictors
IEEE Transactions on Software Engineering
Prioritizing Warning Categories by Analyzing Software History
MSR '07 Proceedings of the Fourth International Workshop on Mining Software Repositories
Predicting Defects for Eclipse
PROMISE '07 Proceedings of the Third International Workshop on Predictor Models in Software Engineering
Which warnings should I fix first?
Proceedings of the 6th Joint Meeting of the European Software Engineering Conference and the ACM SIGSOFT Symposium on the Foundations of Software Engineering
DSD-Crasher: A hybrid analysis tool for bug finding
ACM Transactions on Software Engineering and Methodology (TOSEM)
Predicting accurate and actionable static analysis warnings: an experimental approach
Proceedings of the 30th International Conference on Software Engineering
Proceedings of the Second ACM-IEEE International Symposium on Empirical Software Engineering and Measurement
A Model Building Process for Identifying Actionable Static Analysis Alerts
ICST '09 Proceedings of the 2009 International Conference on Software Testing Verification and Validation
Z-ranking: using statistical analysis to counter the impact of static analysis approximations
SAS '03 Proceedings of the 10th International Conference on Static Analysis
A systematic model building process for predicting actionable static analysis alerts
Information and Software Technology
EFindBugs: Effective Error Ranking for FindBugs
ICST '11 Proceedings of the 2011 Fourth IEEE International Conference on Software Testing, Verification and Validation
A Framework to Compare Alert Ranking Algorithms
WCRE '12 Proceedings of the 2012 19th Working Conference on Reverse Engineering
Automated static analysis (ASA) tools can identify potential source code anomalies that could lead to field failures. Developer inspection is required to determine whether an ASA alert is important enough to fix, that is, whether it is an actionable alert. Supplementing current ASA tools with automated identification of actionable alerts could reduce developer inspection overhead and thereby increase industry adoption of ASA tools. The goal of this research is to inform the selection of an actionable alert identification technique for ranking the output of automated static analysis through a comparative evaluation of such techniques. We investigated six actionable alert identification techniques on three subject projects. Among these six techniques, the systematic actionable alert identification (SAAI) technique reported an average accuracy of 82.5% across the three subject projects when considering both ASA tools evaluated. Check 'n' Crash reported an average accuracy of 85.8% for the single ASA tool evaluated. The remaining techniques had average accuracies ranging from 42.2% to 78.2%.
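The accuracies reported above compare each technique's predicted classification of an alert against a developer-judged ground truth. Below is a minimal sketch of that computation, assuming a hypothetical LabeledAlert record that pairs a predicted label with an oracle label; the names and data are illustrative and not taken from the study.

    // Accuracy of an actionable-alert classifier: the fraction of alerts whose
    // predicted label matches the developer-assigned (oracle) label.
    import java.util.List;

    public class AlertAccuracy {

        // Hypothetical representation of one inspected alert.
        record LabeledAlert(boolean predictedActionable, boolean actuallyActionable) {}

        // Accuracy = (correctly classified alerts) / (all inspected alerts).
        static double accuracy(List<LabeledAlert> alerts) {
            if (alerts.isEmpty()) {
                return 0.0;
            }
            long correct = alerts.stream()
                    .filter(a -> a.predictedActionable() == a.actuallyActionable())
                    .count();
            return (double) correct / alerts.size();
        }

        public static void main(String[] args) {
            // Illustrative alerts only.
            List<LabeledAlert> alerts = List.of(
                    new LabeledAlert(true, true),    // correctly flagged as actionable
                    new LabeledAlert(true, false),   // false positive
                    new LabeledAlert(false, false),  // correctly suppressed
                    new LabeledAlert(false, true));  // missed actionable alert
            System.out.printf("Accuracy: %.1f%%%n", 100 * accuracy(alerts));
        }
    }

Under this standard definition of classification accuracy, the four-alert example prints 50.0%; the per-technique figures quoted in the abstract aggregate the same kind of ratio over all inspected alerts of each subject project.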