Fault-prone module detection in source code is important for assuring software quality. Most previous fault-prone detection approaches are based on software metrics; such approaches, however, have difficulty both in collecting the metrics and in constructing mathematical models from them. To mitigate these difficulties, we have proposed a novel approach to detecting fault-prone modules using a spam-filtering technique, named Fault-Prone Filtering. In this approach, source code modules are treated as text files and applied to the spam filter directly. In practice, we use the training-on-errors procedure and apply it to fault-prone module detection. Since no pre-training is required, this procedure can be applied in an actual development setting immediately. This paper describes an extension of the training-on-errors procedure. We introduce a more precise unit of training, "modified lines of code," instead of whole methods. In addition, we introduce a dynamic threshold for classification. Experimental results show that our extension roughly doubles precision at about the same recall, and improves the best F1 measure by 15%.
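The core idea above (treat modules as text, classify with a spam-filter-style model, and retrain only on misclassifications) can be sketched as follows. This is a minimal illustration under assumed details, not the authors' actual implementation: the token scoring here is a crude smoothed log-ratio rather than a production spam filter, and the names `TokenFilter` and `train_on_errors` are hypothetical.

```python
import math
from collections import defaultdict

class TokenFilter:
    """Toy spam-filter-style classifier over source-code tokens (sketch)."""
    def __init__(self):
        # Per-class token frequency tables: "fp" = fault-prone, "nfp" = not.
        self.counts = {"fp": defaultdict(int), "nfp": defaultdict(int)}

    def train(self, tokens, label):
        for t in tokens:
            self.counts[label][t] += 1

    def score(self, tokens):
        # Sum of smoothed log-ratios; positive scores lean fault-prone.
        return sum(
            math.log((self.counts["fp"][t] + 1) / (self.counts["nfp"][t] + 1))
            for t in tokens
        )

def train_on_errors(filt, stream, threshold=0.0):
    """Classify each incoming module; retrain only when the prediction is wrong.

    `stream` yields (tokens, true_label) pairs in arrival order, so no
    pre-training is needed. `threshold` stands in for the classification
    cutoff (the paper's dynamic threshold would adjust it over time).
    Returns a list of booleans: was each prediction correct?
    """
    results = []
    for tokens, label in stream:
        pred = "fp" if filt.score(tokens) > threshold else "nfp"
        results.append(pred == label)
        if pred != label:
            filt.train(tokens, label)
    return results
```

With this procedure, early modules are often misclassified (the filter starts empty) and immediately become training data, so accuracy improves as the project history accumulates.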