Analytic Optimization of Shrinkage Parameters Based on Regularized Subspace Information Criterion
IEICE Transactions on Fundamentals of Electronics, Communications and Computer Sciences
To obtain good generalization performance in supervised learning, model parameters should be chosen so that the generalization error is minimized. However, since the generalization error is inaccessible in practice, the parameters are usually determined by minimizing an estimator of the generalization error instead. The regularized subspace information criterion (RSIC) is one such generalization error estimator for model selection. RSIC contains an additional regularization parameter, which itself must be determined appropriately for good model selection. A meta-criterion for determining this regularization parameter has been proposed and shown to be useful in practice. In this paper, we point out several drawbacks of the existing meta-criterion and propose an alternative meta-criterion that overcomes them. Through simulations, we show that the new meta-criterion further improves model selection performance.
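The general recipe described above, choosing a model parameter by minimizing an estimator of the inaccessible generalization error, can be sketched generically. The exact form of RSIC is not given here, so the snippet below uses a held-out validation error as a hypothetical stand-in estimator and a ridge (shrinkage) parameter as the model parameter being selected; the data, feature map, and parameter grid are all illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy regression data (illustrative): y = sin(x) + noise.
x = rng.uniform(-3, 3, size=100)
X = np.vander(x, N=8, increasing=True)   # polynomial feature matrix
y = np.sin(x) + 0.2 * rng.standard_normal(100)

# Held-out split: the validation error plays the role of the
# generalization error estimator (a stand-in for a criterion like RSIC).
Xtr, Xva = X[:70], X[70:]
ytr, yva = y[:70], y[70:]

def ridge_fit(X, y, lam):
    """Ridge regression: minimize ||y - Xw||^2 + lam * ||w||^2."""
    d = X.shape[1]
    return np.linalg.solve(X.T @ X + lam * np.eye(d), X.T @ y)

# Model selection: pick the shrinkage parameter that minimizes
# the error estimator over a grid of candidates.
grid = [10.0 ** k for k in range(-6, 3)]
errors = {lam: np.mean((Xva @ ridge_fit(Xtr, ytr, lam) - yva) ** 2)
          for lam in grid}
best_lam = min(errors, key=errors.get)
print(f"selected lambda = {best_lam:g}, estimated error = {errors[best_lam]:.4f}")
```

A criterion such as RSIC would replace the held-out error with an analytic estimate computed from the training data alone, avoiding the need to sacrifice samples for validation.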