The problem of designing the regularization term and regularization parameter for linear regression models is discussed. In earlier work, we derived an approximation to the generalization error called the subspace information criterion (SIC), which, under certain conditions, is an unbiased estimator of the generalization error even with finite samples. In this paper, we apply SIC to regularization learning and use it to (a) choose the optimal regularization term and regularization parameter from given candidates, and (b) obtain a closed form for the optimal regularization parameter when the regularization term is fixed. The effectiveness of SIC is demonstrated through computer simulations with both artificial and real data.
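To make the selection step (a) concrete, the sketch below shows a SIC-style criterion for the special case of ridge regression (a quadratic regularization term). It assumes the squared parameter-space norm as the generalization measure, an unbiased least-squares estimator as the reference, and noise variance estimated from least-squares residuals; the function and variable names are illustrative, not taken from the paper, and the paper's setting (general regularization terms, function-space norms) is broader than this example.

```python
import numpy as np

def sic_select_ridge(X, y, lambdas, sigma2=None):
    """Pick a ridge parameter by a SIC-style unbiased risk estimate (sketch).

    For a linear estimator theta_hat = L @ y and an unbiased reference
    theta_u = Lu @ y (E[theta_u] = theta*), the criterion
        SIC(lam) = ||theta_hat - theta_u||^2
                   - sigma2 * tr((L - Lu)(L - Lu)^T)
                   + sigma2 * tr(L L^T)
    is an unbiased estimate of E||theta_hat - theta*||^2.
    Assumes X has full column rank and n > d.
    """
    n, d = X.shape
    Lu = np.linalg.pinv(X)                  # least-squares learning matrix
    theta_u = Lu @ y                        # unbiased reference estimate
    if sigma2 is None:
        # noise variance estimated from least-squares residuals
        sigma2 = np.sum((y - X @ theta_u) ** 2) / (n - d)
    scores = []
    for lam in lambdas:
        # ridge learning matrix: (X^T X + lam I)^{-1} X^T
        L = np.linalg.solve(X.T @ X + lam * np.eye(d), X.T)
        theta = L @ y
        D = L - Lu
        sic = (np.sum((theta - theta_u) ** 2)
               - sigma2 * np.trace(D @ D.T)   # debias the plug-in distance
               + sigma2 * np.trace(L @ L.T))  # variance of the ridge estimate
        scores.append(sic)
    best = int(np.argmin(scores))
    return lambdas[best], np.array(scores)

# Toy usage: 5-dimensional linear model, 50 noisy samples.
rng = np.random.default_rng(0)
X = rng.standard_normal((50, 5))
theta_true = rng.standard_normal(5)
y = X @ theta_true + 0.3 * rng.standard_normal(50)
lam_best, scores = sic_select_ridge(X, y, np.logspace(-4, 2, 30))
```

Because the criterion is evaluated on a finite grid here, this only approximates step (b); the closed-form optimal parameter discussed in the paper removes the grid search for a fixed regularization term.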