The Akaike information criterion, AIC, and its corrected version, AICc, are two methods for selecting normal linear regression models. Both criteria were designed as estimators of the expected Kullback-Leibler information between the model generating the data and the approximating candidate model. In this paper, two new corrected variants of AIC are derived for small sample linear regression model selection. The proposed variants of AIC are based on asymptotic approximations of bootstrap-type estimates of the Kullback-Leibler information. These new variants are of particular interest when the computational cost of the bootstrap is hard to justify. As is the case for AICc, the new variants are asymptotically equivalent to AIC. Simulation results are presented illustrating the improved performance of the proposed AIC corrections over AIC, AICc, and other criteria when applied to polynomial regression. Asymptotic justifications for the proposed criteria are provided in the Appendix.
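For context, the sketch below illustrates the two baseline criteria the abstract compares against, applied to polynomial regression order selection. The AIC and AICc formulas shown are the standard ones for a normal linear model with unknown noise variance (AICc in the Hurvich-Tsai form); the paper's new bootstrap-based corrections are not reproduced here, since their formulas are not given in the abstract. The data-generating setup and all variable names are illustrative assumptions.

```python
import numpy as np

def fit_polynomial(x, y, degree):
    """Least-squares polynomial fit; returns the residual sum of squares
    and p, the number of regression coefficients."""
    X = np.vander(x, degree + 1)                    # design matrix, p = degree + 1 columns
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    rss = np.sum((y - X @ beta) ** 2)
    return rss, X.shape[1]

def aic(n, rss, p):
    """AIC for a normal linear model, additive constants dropped.
    The +1 counts the estimated noise variance as a parameter."""
    return n * np.log(rss / n) + 2 * (p + 1)

def aicc(n, rss, p):
    """Small-sample corrected AICc (Hurvich-Tsai form); asymptotically
    equivalent to AIC as n grows with p fixed."""
    return n * np.log(rss / n) + n * (n + p) / (n - p - 2)

# Toy small-sample experiment: true model is a degree-2 polynomial.
rng = np.random.default_rng(0)
n = 25
x = np.linspace(-1.0, 1.0, n)
y = 1.0 - 2.0 * x + 0.5 * x**2 + 0.3 * rng.standard_normal(n)

for degree in range(6):
    rss, p = fit_polynomial(x, y, degree)
    print(f"degree {degree}: AIC = {aic(n, rss, p):7.2f}, AICc = {aicc(n, rss, p):7.2f}")
```

With small n, plain AIC tends to favor over-parameterized fits, while AICc penalizes extra coefficients more heavily; the corrections proposed in the paper target the same small-sample regime without requiring the repeated refitting that a full bootstrap estimate of the Kullback-Leibler information would entail.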