Tailoring the software process to project goals and environments
ICSE '87 Proceedings of the 9th international conference on Software Engineering
Why CSCW applications fail: problems in the design and evaluation of organizational interfaces
CSCW '88 Proceedings of the 1988 ACM conference on Computer-supported cooperative work
Quantitative evaluation of software quality
ICSE '76 Proceedings of the 2nd international conference on Software engineering
An empirical evaluation of the G/Q/M method
CASCON '93 Proceedings of the 1993 conference of the Centre for Advanced Studies on Collaborative research: software engineering - Volume 1
This paper introduces an evaluation method that supports comparison of results from like-structured evaluations conducted over time and under changing toolsets or environmental conditions, making the framework well suited to comparing collaboration tools. The framework structures evaluations by mapping system goals to evaluation objectives, metrics, and measures. The upper levels of the framework are conceptual, while the bottom level is implementation-specific, i.e., evaluation-specific. Careful construction of the conceptual elements of an evaluation template allows its reuse across a series of like-structured evaluations and comparison of their results.
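The goal-to-measure mapping the abstract describes could be sketched as a reusable template whose conceptual levels (goal, objectives, metrics) are fixed while the bottom level (measured values) varies per evaluation. This is a minimal illustrative sketch; all class, field, and metric names here are hypothetical, not the paper's actual notation:

```python
from dataclasses import dataclass, field

@dataclass
class Metric:
    name: str  # conceptual metric, e.g. "messages per task"

@dataclass
class Objective:
    statement: str
    metrics: list = field(default_factory=list)

@dataclass
class EvaluationTemplate:
    """Conceptual levels are reusable; measures are evaluation-specific."""
    goal: str
    objectives: list = field(default_factory=list)

    def run(self, measures: dict) -> dict:
        # Bind evaluation-specific measure values (metric name -> value)
        # to the fixed conceptual template, enabling like-structured
        # comparisons across toolsets or points in time.
        return {
            "goal": self.goal,
            "results": {m.name: measures.get(m.name)
                        for o in self.objectives for m in o.metrics},
        }

# One reusable template (conceptual levels) ...
template = EvaluationTemplate(
    goal="Assess collaboration-tool effectiveness",
    objectives=[Objective("Measure coordination overhead",
                          [Metric("messages per task")])],
)

# ... applied to two like-structured evaluations with different toolsets.
run_a = template.run({"messages per task": 12})
run_b = template.run({"messages per task": 8})
```

Because both runs share the same conceptual template, their result dictionaries are directly comparable metric-by-metric, which is the reuse-and-compare property the abstract emphasizes.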