This paper reports a comprehensive experimental study of the human competitiveness of search based software engineering (SBSE). The experiments covered four well-known SBSE problem formulations: the next release problem, the multi-objective next release problem, the workgroup formation problem, and the multi-objective test case selection problem. For each problem, two instances of increasing size were synthetically generated and solved both by metaheuristics and by human subjects. A total of 63 professional software engineers participated in the experiment, each solving some or all of the problem instances and together producing 128 responses. The comparative analysis strongly suggests that the results generated by search based software engineering are human competitive.
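For readers unfamiliar with the formulations above, the (single-objective) next release problem is typically cast as selecting a subset of requirements that maximizes stakeholder value subject to a cost budget. The sketch below is only an illustration of the kind of metaheuristic the study compares against humans, not the paper's actual algorithm or instances; the costs, values, and budget are hypothetical, and random-restart hill climbing stands in for whatever metaheuristic the experiments used.

```python
import random

def nrp_value(selection, values):
    # Total stakeholder value of the selected requirements.
    return sum(v for s, v in zip(selection, values) if s)

def nrp_cost(selection, costs):
    # Total implementation cost of the selected requirements.
    return sum(c for s, c in zip(selection, costs) if s)

def hill_climb_nrp(costs, values, budget, restarts=20, seed=0):
    """Random-restart hill climbing for the next release problem:
    maximize total value while keeping total cost within the budget."""
    rng = random.Random(seed)
    n = len(costs)
    best_sel, best_val = [0] * n, 0
    for _ in range(restarts):
        sel = [0] * n          # start from the empty (always feasible) selection
        cur_val = 0
        improved = True
        while improved:        # keep flipping bits while any flip improves value
            improved = False
            for i in rng.sample(range(n), n):
                sel[i] ^= 1    # tentatively flip requirement i in/out
                if nrp_cost(sel, costs) <= budget and nrp_value(sel, values) > cur_val:
                    cur_val = nrp_value(sel, values)
                    improved = True
                else:
                    sel[i] ^= 1  # revert an infeasible or non-improving flip
        if cur_val > best_val:
            best_sel, best_val = sel[:], cur_val
    return best_sel, best_val

# Hypothetical instance: four requirements, budget of 9.
costs, values, budget = [3, 5, 2, 4], [6, 10, 3, 7], 9
selection, value = hill_climb_nrp(costs, values, budget)
```

Because hill climbing only accepts improving flips, each restart ends in a local optimum; multiple restarts reduce (but do not eliminate) the chance of missing the global optimum, which is one reason such human-competitiveness comparisons need controlled instances.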