Performance evaluation of competing forecasting models: A multidimensional framework based on MCDA

  • Authors:
  • Bing Xu; Jamal Ouenniche

  • Affiliations:
  • Aberdeen Business School, Robert Gordon University, Garthdee Road, Aberdeen AB10 7QE, UK; Business School, The University of Edinburgh, 29 Buccleuch Place, Edinburgh EH8 9JS, UK; School of Business, ESC Rennes, Rennes, France

  • Venue:
  • Expert Systems with Applications: An International Journal

  • Year:
  • 2012

Abstract

So far, competing forecasting models have been compared to each other using a single criterion at a time, which often leads to different rankings under different criteria - a situation in which one cannot make an informed decision as to which model performs best overall, i.e., when all performance criteria are taken into account. To overcome this methodological problem, we propose a Multi-Criteria Decision Analysis (MCDA) based framework and discuss how it can be adapted to the problem of relative performance evaluation of competing forecasting models. Three outranking methods are used in our empirical experiments to rank-order competing forecasting models of crude oil prices, namely ELECTRE III, PROMETHEE I, and PROMETHEE II. Our empirical results reveal that the multidimensional framework provides a valuable tool for apprehending the true nature of the relative performance of competing forecasting models. In addition, as far as the evaluation of the relative performance of the forecasting models considered in this study is concerned, the rankings of the best and the worst performing models do not seem to be sensitive to the choice of importance weights or outranking methods, which suggests that the ranks of these models are robust.
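
To illustrate the kind of multi-criteria ranking the abstract describes, the following is a minimal sketch, not the authors' implementation, of how PROMETHEE II could rank-order competing forecasting models evaluated on several performance criteria. The model names, criteria scores, and importance weights are hypothetical, and the simple "usual" (strict) preference function is assumed; the paper may use different criteria, preference functions, or thresholds.

```python
import numpy as np

# Hypothetical performance matrix: rows are competing forecasting models,
# columns are performance criteria (e.g., RMSE, MAE, directional accuracy).
models = ["Random walk", "ARIMA", "VAR", "Neural net"]
scores = np.array([
    [2.10, 1.60, 0.52],
    [1.85, 1.40, 0.58],
    [1.90, 1.45, 0.61],
    [1.70, 1.30, 0.55],
])
# True where larger is better (accuracy), False where smaller is better (errors).
maximize = np.array([False, False, True])
weights = np.array([0.4, 0.3, 0.3])  # illustrative importance weights, summing to 1

def promethee_ii_net_flows(scores, weights, maximize):
    """Return PROMETHEE II net outranking flows with the 'usual' preference function."""
    n = scores.shape[0]
    # Pairwise differences oriented so that positive means "row model is better".
    diffs = scores[:, None, :] - scores[None, :, :]
    diffs[:, :, ~maximize] *= -1
    # Usual criterion: preference is 1 if the difference is strictly positive, else 0.
    prefs = (diffs > 0).astype(float)
    # Weighted aggregation over criteria gives the preference index pi(a, b).
    pi = prefs @ weights
    # Positive, negative, and net outranking flows.
    phi_plus = pi.sum(axis=1) / (n - 1)
    phi_minus = pi.sum(axis=0) / (n - 1)
    return phi_plus - phi_minus

net_flows = promethee_ii_net_flows(scores, weights, maximize)
for name, phi in sorted(zip(models, net_flows), key=lambda x: -x[1]):
    print(f"{name}: net flow = {phi:+.3f}")
```

Sorting the models by descending net flow yields the complete ranking that PROMETHEE II produces; PROMETHEE I and ELECTRE III, used alongside it in the paper, instead allow incomparabilities and so may return only a partial ranking.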