No-free-lunch theorems have shown that learning algorithms cannot be universally good. We show that no free lunch exists for noise prediction as well: when the noise is additive and the prior over target functions is uniform, a prior on the noise distribution cannot be updated, in the Bayesian sense, from any finite data set. This underscores the importance of a prior over the target function in justifying superior performance for any learning system.
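The Bayesian claim can be illustrated with a minimal finite sketch. The setup below is an assumption chosen for tractability, not the paper's own construction: outputs and noise live in Z_m, noise is added mod m, the inputs in the sample are distinct, and the prior over target functions is uniform over all maps from a tiny input space to Z_m. Under these assumptions the marginal likelihood of the data is the same constant for every noise distribution, so the posterior over noise distributions equals the prior.

```python
import itertools

def marginal_likelihood(data, noise, m, xs):
    """P(D | q) under a uniform prior over all functions xs -> Z_m.

    Averages prod_i q[(y_i - f(x_i)) mod m] over every possible
    target function f, each weighted 1 / m**len(xs).
    """
    total = 0.0
    for f_vals in itertools.product(range(m), repeat=len(xs)):
        f = dict(zip(xs, f_vals))
        p = 1.0
        for x, y in data:
            p *= noise[(y - f[x]) % m]   # additive noise, mod m
        total += p
    return total / m ** len(xs)

m = 4                       # outputs and noise take values in Z_4
xs = [0, 1, 2]              # a tiny input space
data = [(0, 1), (1, 3), (2, 0)]   # distinct inputs (illustrative sample)

q_peaked = [0.7, 0.1, 0.1, 0.1]   # sharply concentrated noise
q_flat = [0.25, 0.25, 0.25, 0.25] # uniform noise

L_peaked = marginal_likelihood(data, q_peaked, m, xs)
L_flat = marginal_likelihood(data, q_flat, m, xs)
print(L_peaked, L_flat)  # identical: the data cannot favor either noise model
```

Because each f(x_i) is uniform and independent under the prior, each data point contributes sum_v q[(y_i - v) mod m] / m = 1/m regardless of q, so P(D | q) = (1/m)^n for every noise distribution q, and Bayes' rule leaves the noise prior unchanged. Note that this factorization relies on the inputs being distinct; it is a sketch of the phenomenon the abstract describes, not a proof of the general theorem.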