Note on free lunches and cross-validation
Neural Computation
It is known theoretically that no algorithm can be good for an arbitrary prior. We show that, in practical terms, this also applies to the technique of "cross-validation," which has been widely regarded as exempt from this general rule. Numerical examples are analyzed in detail, and their implications for research on learning algorithms are discussed.
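The no-free-lunch argument behind this claim can be illustrated with a small exhaustive computation. The sketch below is not the paper's own numerical example; it is a hedged toy construction on a five-point Boolean domain with two hypothetical fixed predictors (`learner_a`, `learner_b`). Averaged uniformly over every target function consistent with the training data, the predictor that leave-one-out cross-validation selects has exactly the same off-training-set error as the predictor it rejects:

```python
# Toy no-free-lunch check (illustrative sketch, not the paper's code).
# We average off-training-set (OTS) error over ALL targets consistent
# with the training data (a uniform prior) and compare the learner that
# leave-one-out cross-validation picks against the one it rejects.
from itertools import product

DOMAIN = list(range(5))                      # inputs x = 0..4
TRAIN = {0: 0, 1: 1, 2: 0}                   # fixed training labels
OTS = [x for x in DOMAIN if x not in TRAIN]  # off-training-set inputs

def learner_a(x):
    return 0          # hypothetical predictor: always outputs 0

def learner_b(x):
    return x % 2      # hypothetical predictor: parity of x

def loo_error(learner):
    # Leave-one-out CV error on the training set; since these predictors
    # ignore the training data, this reduces to training-set misfit.
    return sum(learner(x) != y for x, y in TRAIN.items()) / len(TRAIN)

# Cross-validation selects the predictor with the smaller LOO error.
cv_pick, cv_reject = sorted([learner_a, learner_b], key=loo_error)

def mean_ots_error(learner):
    # Average OTS error over every target consistent with TRAIN,
    # i.e. a uniform prior over the 2**len(OTS) possible completions.
    total = 0.0
    completions = list(product([0, 1], repeat=len(OTS)))
    for labels in completions:
        target = dict(zip(OTS, labels))
        total += sum(learner(x) != target[x] for x in OTS) / len(OTS)
    return total / len(completions)

print(mean_ots_error(cv_pick))    # 0.5
print(mean_ots_error(cv_reject))  # 0.5
```

Both averages come out to 0.5: under a uniform prior over targets, cross-validation's choice confers no off-training-set advantage, which is the sense in which it cannot defy the general rule.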