Confidence intervals for the risks of regression models
ICONIP'06: Proceedings of the 13th International Conference on Neural Information Processing - Volume Part I
This paper presents a new cross-validation (CV) method for nonlinear regression problems. In conventional CV methods, a validation set, that is, a part of the training data, is held out to check the performance of learning. As a result, the trained regression models cannot use the whole training data and achieve lower performance than would be expected for the full training set. In this context, we construct a performance prediction model from the validation set in order to determine the optimal structure for the whole training data. We analyze risk bounds using VC dimension theory and suggest a parameterized form of risk estimates for the performance prediction model. As a result, we can estimate the optimal structure for the whole training data using the suggested CV method, referred to as the parameterized CV (p-CV) method. Through simulations on function approximation, we show the effectiveness of our approach.
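For context, the conventional CV baseline that the paper improves on can be sketched as follows. This is a minimal, hypothetical example, not the paper's p-CV method: K-fold cross-validation selects the model structure (here, polynomial degree) that minimizes the average validation risk. The target function, noise level, and model family are assumptions chosen purely for illustration; p-CV would instead fit a parameterized risk model to the validation results and extrapolate the optimal structure for the full training set.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic function-approximation task (illustrative stand-in):
# noisy samples of a smooth nonlinear target.
x = rng.uniform(-1.0, 1.0, 60)
y = np.sin(np.pi * x) + rng.normal(0.0, 0.1, x.size)

def kfold_cv_risk(x, y, degree, k=5):
    """Average validation MSE of a degree-`degree` polynomial over k folds."""
    idx = np.arange(x.size)
    risks = []
    for val in np.array_split(idx, k):
        tr = np.setdiff1d(idx, val)                 # training fold
        coef = np.polyfit(x[tr], y[tr], degree)     # fit on training part only
        pred = np.polyval(coef, x[val])             # evaluate on held-out part
        risks.append(np.mean((pred - y[val]) ** 2))
    return float(np.mean(risks))

# Conventional CV: pick the structure minimizing average validation risk.
degrees = range(1, 9)
cv_risk = {d: kfold_cv_risk(x, y, d) for d in degrees}
best = min(cv_risk, key=cv_risk.get)
print(best, cv_risk[best])
```

Note that each fold's model sees only part of the data, which is exactly the limitation the abstract points out: the structure chosen this way is optimal for the reduced training sets, not necessarily for the whole sample.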