Information criteria: How do they behave in different models?

  • Authors:
  • Paulo C. Emiliano; Mário J. F. Vivanco; Fortunato S. De Menezes


  • Venue:
  • Computational Statistics & Data Analysis
  • Year:
  • 2014


Abstract

The choice of the best model is crucial in modeling data, and parsimony is one of the principles that must guide this choice. Despite their broad use in model selection, the foundations of the Akaike information criterion (AIC), the corrected Akaike criterion (AICc) and the Bayesian information criterion (BIC) are, in general, poorly understood. The AIC, AICc and BIC penalize the likelihood in order to select the simplest adequate model. These criteria are based on concepts of information and entropy, which are explained in this work from a statistical viewpoint. The three criteria are compared through Monte Carlo simulations, and their applications are investigated in the selection of normal models, biological growth models and time series models. In the simulation with normal models, all three criteria performed poorly for a small sample size of N=100, particularly when the variances differed only slightly. In the biological growth model simulations with a very small sample size of N=13, the AIC and AICc outperformed the BIC. The time series simulations produced results similar to those of the normal models: the BIC performed better than the AIC and AICc in some cases for a small sample size of N=100, but in other cases the BIC performed poorly, as did the AIC and AICc.
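As context for how these criteria penalize the likelihood, the standard textbook formulas (AIC = 2k − 2 ln L, AICc = AIC + 2k(k+1)/(n−k−1), BIC = k ln n − 2 ln L, for k estimated parameters, n observations, and maximized likelihood L) can be sketched in Python. This is a generic illustration, not code from the paper, and the specific simulation settings of the study are not reproduced here:

```python
import math

def aic(log_lik: float, k: int) -> float:
    """Akaike information criterion: 2k - 2 ln L."""
    return 2 * k - 2 * log_lik

def aicc(log_lik: float, k: int, n: int) -> float:
    """Corrected AIC: adds a small-sample penalty 2k(k+1)/(n-k-1)."""
    return aic(log_lik, k) + (2 * k * (k + 1)) / (n - k - 1)

def bic(log_lik: float, k: int, n: int) -> float:
    """Bayesian information criterion: k ln n - 2 ln L."""
    return k * math.log(n) - 2 * log_lik

# Hypothetical example: a model with k=3 parameters, n=100 observations,
# and maximized log-likelihood ln L = -100. Lower values indicate a
# better trade-off between fit and parsimony.
print(aic(-100.0, 3))        # 206.0
print(aicc(-100.0, 3, 100))  # 206.25
print(round(bic(-100.0, 3, 100), 4))
```

For fixed n, the BIC penalty k ln n exceeds the AIC penalty 2k once n > e² ≈ 7.4, which is one way to see why the BIC tends to favor simpler models than the AIC in larger samples.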