Integrated Assessment Models of global climate change (IAMs) are an established tool for studying interlinkages between the human and the natural system. Insights from these complex models are widely used to advise policy-makers and to inform the general public. Yet there is so far little understanding of how these models can be evaluated, and community-wide standards are missing. Answering this urgent question is challenging because the systems are open and their future behavior is fundamentally unknown. In this paper, we discuss ways to overcome these problems. Reflecting on experience from other modeling communities, we develop an evaluation framework for IAMs of global climate change. It builds on a systematic and transparent step-by-step demonstration of a model's usefulness by testing the plausibility of its behavior. The steps in the evaluation hierarchy are: setting up an evaluation framework, evaluating the conceptual model, code verification and documentation, model evaluation, uncertainty and sensitivity analysis, documentation of the evaluation process, and communication with stakeholders. An important element in evaluating IAMs of global climate change is the use of stylized behavior patterns derived from historical observation. Two examples are discussed in this paper.
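One step in the evaluation hierarchy above, uncertainty and sensitivity analysis, is commonly carried out with global (variance-based) sensitivity indices estimated by Monte Carlo sampling. As a minimal sketch of that technique, not of any specific IAM, the code below estimates first-order Sobol' indices for the Ishigami function, a standard analytic benchmark; the function and parameter names are illustrative choices, not part of the framework described in the paper.

```python
import numpy as np

def ishigami(x, a=7.0, b=0.1):
    """Ishigami test function, a common benchmark for sensitivity analysis."""
    return (np.sin(x[:, 0])
            + a * np.sin(x[:, 1]) ** 2
            + b * x[:, 2] ** 4 * np.sin(x[:, 0]))

def sobol_first_order(model, dim, n=200_000, seed=0):
    """Monte Carlo estimate of first-order Sobol' indices.

    Uses two independent sample matrices A and B on [-pi, pi]^dim and the
    Saltelli-style estimator S_i = E[f(B) * (f(AB_i) - f(A))] / Var(f),
    where AB_i is A with column i taken from B.
    """
    rng = np.random.default_rng(seed)
    A = rng.uniform(-np.pi, np.pi, (n, dim))
    B = rng.uniform(-np.pi, np.pi, (n, dim))
    fA, fB = model(A), model(B)
    total_var = np.var(np.concatenate([fA, fB]))
    S = np.empty(dim)
    for i in range(dim):
        AB_i = A.copy()
        AB_i[:, i] = B[:, i]  # resample only input i
        S[i] = np.mean(fB * (model(AB_i) - fA)) / total_var
    return S

S = sobol_first_order(ishigami, dim=3)
```

For the Ishigami function the analytic first-order indices are roughly S1 ≈ 0.31, S2 ≈ 0.44, S3 = 0, so the Monte Carlo estimates should recover that x2 contributes most of the output variance on its own and x3 contributes only through interactions.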