The nature of statistical learning theory.
Subspace Information Criterion for Model Selection. Neural Computation.
Fisher information and stochastic complexity. IEEE Transactions on Information Theory.
A decision-theoretic extension of stochastic complexity and its applications to learning. IEEE Transactions on Information Theory.
Asymptotic statistical theory of overtraining and cross-validation. IEEE Transactions on Neural Networks.
Model complexity control for regression using VC generalization bounds. IEEE Transactions on Neural Networks.
A unified method for optimizing linear image restoration filters. Signal Processing - Image and Video Coding beyond Standards.
The subspace information criterion for infinite dimensional hypothesis spaces. The Journal of Machine Learning Research.
Generalization Error Estimation for Non-linear Learning Methods. IEICE Transactions on Fundamentals of Electronics, Communications and Computer Sciences.
Recently, a new model selection criterion called the subspace information criterion (SIC) was proposed. SIC works well with small samples because it gives an unbiased estimate of the generalization error even with a finite number of samples. In this paper, we evaluate the effectiveness of SIC, both theoretically and experimentally, in comparison with existing model selection techniques: the traditional leave-one-out cross-validation (CV), Mallows's CP, Akaike's information criterion (AIC), Sugiura's corrected AIC (cAIC), Schwarz's Bayesian information criterion (BIC), Rissanen's minimum description length criterion (MDL), and Vapnik's measure (VM). The theoretical evaluation compares the criteria in terms of their generalization measures, approximation methods, and restrictions on model candidates and learning methods. The experimental evaluation investigates the performance of SIC in various situations. The simulations show that SIC outperforms the existing techniques, especially when the number of training examples is small and the noise variance is large.
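To make the comparison concrete, the sketch below is a purely illustrative Python example (not the paper's experimental setup, and omitting SIC itself, whose construction is given in the cited works). It fits polynomial models of increasing degree to synthetic data and scores each candidate with a few of the classical criteria named above: AIC, Sugiura's corrected AIC, BIC (the two-part MDL criterion takes the same form in this setting), Mallows's CP, and leave-one-out CV. The target function, noise level, sample size, and candidate degrees are arbitrary choices made for this example.

import numpy as np

rng = np.random.default_rng(0)
n = 25                                  # deliberately small sample
sigma = 0.3                             # noise standard deviation (assumed known for CP)
x = np.linspace(-1.0, 1.0, n)
y = np.sinc(2.0 * x) + sigma * rng.standard_normal(n)   # arbitrary toy target function

def fit_poly(x_tr, y_tr, degree):
    # Ordinary least-squares fit of a polynomial of the given degree.
    X = np.vander(x_tr, degree + 1)
    coef, *_ = np.linalg.lstsq(X, y_tr, rcond=None)
    return coef

def criteria(x_tr, y_tr, degree, sigma2):
    coef = fit_poly(x_tr, y_tr, degree)
    resid = y_tr - np.polyval(coef, x_tr)
    rss = float(resid @ resid)
    m = len(y_tr)
    k = degree + 1                                      # number of regression parameters
    aic = m * np.log(rss / m) + 2 * k                   # AIC (up to an additive constant)
    caic = aic + 2 * k * (k + 1) / (m - k - 1)          # Sugiura's corrected AIC
    bic = m * np.log(rss / m) + k * np.log(m)           # BIC; two-part MDL gives the same form here
    cp = rss / sigma2 - m + 2 * k                       # Mallows's CP with known noise variance
    loo = 0.0                                           # leave-one-out cross-validation
    for i in range(m):
        mask = np.arange(m) != i
        coef_i = fit_poly(x_tr[mask], y_tr[mask], degree)
        loo += (y_tr[i] - np.polyval(coef_i, x_tr[i])) ** 2
    return {"AIC": aic, "cAIC": caic, "BIC/MDL": bic, "CP": cp, "LOO-CV": loo / m}

degrees = range(1, 9)
scores = {d: criteria(x, y, d, sigma ** 2) for d in degrees}
for name in ["AIC", "cAIC", "BIC/MDL", "CP", "LOO-CV"]:
    best = min(degrees, key=lambda d: scores[d][name])
    print(f"{name:>7}: selects polynomial degree {best}")

Each of these criteria trades off the residual error against model complexity in a different way, and several rely on asymptotic arguments. By contrast, the abstract's point is that SIC provides an unbiased estimate of the generalization error with finite samples, which is why it is expected to help most in the small-sample, high-noise regime studied in the paper.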