We discuss the no-free-lunch (NFL) theorem for supervised learning as a logical paradox, that is, as a counterintuitive result that is correctly proven from apparently incontestable assumptions. We show that the uniform prior used in the proof of the theorem has a number of unpalatable consequences beyond the NFL theorem itself, and we propose a simple definition of determination (by a learning set of given size) that casts further doubt on the utility of this prior. Whereas others have argued that the assumptions of the NFL theorem are unrealistic in practice, we show that they are at odds with supervised learning in principle. This analysis suggests a route toward a more realistic prior probability for use in the extended Bayesian framework.
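To make the role of the uniform prior concrete, the following minimal sketch (illustrative only, not taken from the paper; the five-point domain and the two toy learners are assumptions of the example) averages off-training-set accuracy over all Boolean target functions on a small finite domain. Under this uniform prior, every learner, however sensible or perverse, attains the same expected off-training-set accuracy of 1/2, which is the counterintuitive conclusion of the NFL theorem.

    # Sketch of the NFL uniform-prior averaging argument (illustrative only).
    # Averaged over ALL Boolean target functions on a finite domain, any
    # learner's off-training-set accuracy comes out to exactly 1/2.
    from itertools import product

    DOMAIN = range(5)   # five input points (an assumption of the example)
    TRAIN = [0, 1, 2]   # inputs whose labels the learner sees
    TEST = [3, 4]       # off-training-set inputs

    def majority_learner(labels):
        # Predict the majority training label everywhere.
        guess = int(2 * sum(labels) > len(labels))
        return lambda x: guess

    def minority_learner(labels):
        # Deliberately perverse: predict the opposite of the majority label.
        guess = int(2 * sum(labels) > len(labels))
        return lambda x: 1 - guess

    for name, learner in [("majority", majority_learner),
                          ("minority", minority_learner)]:
        total = 0.0
        # Uniform prior: every function f: DOMAIN -> {0, 1} is equally probable.
        for f in product([0, 1], repeat=len(DOMAIN)):
            h = learner([f[x] for x in TRAIN])
            total += sum(h(x) == f[x] for x in TEST) / len(TEST)
        print(name, total / 2 ** len(DOMAIN))   # both learners print 0.5

The sketch also illustrates the kind of unpalatable consequence the abstract points to: under a uniform prior over all functions, the labels of the training set carry no information whatsoever about the labels of unseen points, so a learning set of any size determines nothing about off-training-set behavior.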