Systems development in information systems research. Journal of Management Information Systems (special issue on management support systems).
Cross-evaluation: a new model for information system evaluation. Journal of the American Society for Information Science and Technology.
A design science research methodology for information systems research. Journal of Management Information Systems.
Design alternatives for the evaluation of design science research artifacts. Proceedings of the 4th International Conference on Design Science Research in Information Systems and Technology (DESRIST 2009).
Design and natural science research on information technology. Decision Support Systems.
Design science in information systems research. MIS Quarterly.
Design principles for research data export: lessons learned in e-health design research. Proceedings of the 8th International Conference on Design Science at the Intersection of Physical and Virtual Design (DESRIST 2013).
ERP event log preprocessing: timestamps vs. accounting logic. Proceedings of the 8th International Conference on Design Science at the Intersection of Physical and Virtual Design (DESRIST 2013).
Enriching process models for business process compliance checking in ERP environments. Proceedings of the 8th International Conference on Design Science at the Intersection of Physical and Virtual Design (DESRIST 2013).
Design science in practice: designing an electricity demand response system. Proceedings of the 8th International Conference on Design Science at the Intersection of Physical and Virtual Design (DESRIST 2013).
A framework for classifying design research methods. Proceedings of the 8th International Conference on Design Science at the Intersection of Physical and Virtual Design (DESRIST 2013).
Respondent behavior logging: an opportunity for online survey design. Proceedings of the 8th International Conference on Design Science at the Intersection of Physical and Virtual Design (DESRIST 2013).
Evaluation is a central and essential activity in conducting rigorous Design Science Research (DSR), yet there is surprisingly little guidance on designing the DSR evaluation activity beyond lists of possible evaluation methods. This paper addresses that problem by extending the framework of Pries-Heje et al. [11], a notable exception to this gap. It proposes an extended DSR evaluation framework together with a DSR evaluation design method that can guide researchers in choosing an appropriate strategy for evaluating the design artifacts and design theories that DSR produces. As input to the choice of evaluation strategy, the extended framework asks the researcher to consider the contextual goals, conditions, and constraints on the evaluation, e.g., the type and level of rigor desired, the type of artifact, the need to support formative development of the artifact, the properties of the artifact to be evaluated, and constraints on available resources such as time, labor, facilities, expertise, and access to research subjects. The framework and method first match these factors to one or more evaluation strategies, including the choice between ex ante evaluation (before artifact construction) and ex post evaluation (after artifact construction), and between naturalistic evaluation (e.g., in a field setting) and artificial evaluation (e.g., in a laboratory setting). Based on the recommended strategy or strategies, guidance is then provided on which evaluation methods may be appropriate within them.
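To make the strategy-selection step concrete, the sketch below renders the framework's two strategic dimensions (ex ante vs. ex post timing; naturalistic vs. artificial setting) as a toy decision function in Python. This is a minimal illustration under assumed simplifications: the factor names (artifact_built, need_real_users, budget_is_tight) and the selection rule are invented for the example and are not the paper's actual method.

```python
# Hypothetical sketch of matching contextual factors to a DSR
# evaluation strategy. Factor names and the rule are illustrative
# assumptions, not the framework's actual decision logic.

from dataclasses import dataclass

@dataclass
class EvaluationContext:
    artifact_built: bool    # is a working artifact available yet?
    need_real_users: bool   # do claims depend on real use in real settings?
    budget_is_tight: bool   # are time, labor, or access constrained?

def choose_strategy(ctx: EvaluationContext) -> tuple[str, str]:
    """Map contextual factors onto the two strategy dimensions:
    ex ante vs. ex post, and naturalistic vs. artificial."""
    timing = "ex post" if ctx.artifact_built else "ex ante"
    # Naturalistic evaluation (real users, real systems, real problems)
    # is chosen here when claims require it and resources allow it.
    setting = ("naturalistic"
               if ctx.need_real_users and not ctx.budget_is_tight
               else "artificial")
    return timing, setting

if __name__ == "__main__":
    ctx = EvaluationContext(artifact_built=True,
                            need_real_users=True,
                            budget_is_tight=False)
    print(choose_strategy(ctx))  # ('ex post', 'naturalistic')
```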