A small sample model selection criterion based on Kullback's symmetric divergence

  • Authors:
  • A.-K. Seghouane; M. Bekara

  • Affiliations:
  • Inst. Nat. de Recherche en Informatique et en Automatique, Le Chesnay, France;-

  • Venue:
  • IEEE Transactions on Signal Processing
  • Year:
  • 2004

Abstract

The Kullback information criterion (KIC) is a recently developed tool for statistical model selection. KIC serves as an asymptotically unbiased estimator of a variant (within a constant) of the Kullback symmetric divergence, also known as the J-divergence, between the generating model and the fitted candidate model. In this paper, a bias correction to KIC is derived for linear regression models. The correction is of particular use when the sample size is small or when the number of fitted parameters is a moderate to large fraction of the sample size. For linear regression models, the corrected criterion, called KICc, is an exactly unbiased estimator of the variant of the Kullback symmetric divergence, assuming that the true model is correctly specified or overfitted. Furthermore, when applied to polynomial regression and autoregressive time-series modeling, KICc is found to estimate the model order more accurately than any other asymptotically efficient method. Finally, KICc is tested on real data for forecasting foreign currency exchange rates; the results compare favorably with those of classical techniques.
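
To illustrate the kind of model selection the abstract describes, the sketch below selects a polynomial regression order by minimizing the large-sample criterion KIC = -2 log L + 3k (Cavanaugh's form, with k the number of estimated parameters), using the standard Gaussian log-likelihood of a least-squares fit. This is a hypothetical example, not code from the paper; in particular, the small-sample bias-correction term that defines KICc is given in the paper itself and is not reproduced here, so only the uncorrected KIC penalty is shown.

```python
# Hypothetical sketch (not from the paper): polynomial-order selection with
# KIC = -2*logL + 3*k for a Gaussian linear-regression fit. The paper's
# small-sample correction (KICc) replaces the 3*k penalty with a corrected
# term; that exact term is not reproduced here.
import numpy as np

def select_order_kic(y, x, max_order):
    """Return the polynomial order in 0..max_order minimizing KIC, plus all scores."""
    n = len(y)
    scores = []
    for p in range(max_order + 1):
        # Design matrix with columns 1, x, x^2, ..., x^p
        X = np.vander(x, p + 1, increasing=True)
        beta, *_ = np.linalg.lstsq(X, y, rcond=None)
        rss = np.sum((y - X @ beta) ** 2)
        sigma2 = rss / n                       # ML estimate of the noise variance
        k = p + 2                              # p+1 coefficients plus the variance
        neg2_loglik = n * np.log(2 * np.pi * sigma2) + n
        scores.append(neg2_loglik + 3 * k)     # KIC penalty 3*k (AIC would use 2*k)
    return int(np.argmin(scores)), scores

# Usage: noisy cubic data; the selected order should typically be 3.
rng = np.random.default_rng(0)
x = np.linspace(-1.0, 1.0, 40)
y = 1.0 - 2.0 * x + 0.5 * x**3 + 0.1 * rng.standard_normal(x.size)
best_order, _ = select_order_kic(y, x, max_order=6)
print("selected order:", best_order)
```

The heavier 3k penalty (relative to AIC's 2k) reflects the symmetric divergence that KIC targets; the paper's contribution, KICc, further adjusts this penalty so the criterion remains unbiased when n is small relative to k.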