Toward actionable, broadly accessible contests in software engineering

  • Authors:
  • Jane Cleland-Huang; Yonghee Shin; Ed Keenan; Adam Czauderna; Greg Leach; Evan Moritz; Malcom Gethers; Denys Poshyvanyk; Jane Huffman Hayes; Wenbin Li

  • Affiliations:
  • DePaul University, USA (Cleland-Huang, Shin, Keenan, Czauderna, Leach); College of William and Mary, USA (Moritz, Gethers, Poshyvanyk); University of Kentucky, USA (Hayes, Li)

  • Venue:
  • Proceedings of the 34th International Conference on Software Engineering
  • Year:
  • 2012

Abstract

Software engineering challenges and contests are becoming increasingly popular for focusing researchers' efforts on particular problems. Such contests tend to follow either an exploratory model, in which the contest holders provide data and ask contestants to discover "interesting things" they can do with it, or a task-oriented model, in which contestants must perform a specific task on a provided dataset. Only occasionally do contests provide more rigorous evaluation mechanisms that precisely specify the task to be performed and the metrics that will be used to evaluate the results. In this paper, we propose actionable and crowd-sourced contests: actionable because the contest describes a precise task, datasets, and evaluation metrics, and also provides a downloadable operating environment for the contest; and crowd-sourced because these features make the contest accessible to information technology hobbyists and students who are attracted by the challenge. We illustrate the proposed approach using research challenges from the software traceability area as well as an experimental workbench named TraceLab.