No Free Lunch for Noise Prediction

  • Authors: Malik Magdon-Ismail
  • Affiliations: California Institute of Technology, Pasadena, CA 91125, U.S.A.
  • Venue: Neural Computation
  • Year: 2000

Abstract

No-free-lunch theorems have shown that learning algorithms cannot be universally good. We show that no free lunch exists for noise prediction as well: when the noise is additive and the prior over target functions is uniform, a prior on the noise distribution cannot be updated, in the Bayesian sense, from any finite data set. We emphasize the importance of a prior over the target function in order to justify superior performance for learning systems.
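
A minimal sketch of why a uniform prior over targets blocks any Bayesian update of the noise prior, in a simplified discrete setting (a finite output alphabet on which outputs and noise values live in a common group, with the noise family parameterized by a hypothetical parameter θ; this illustration is mine and need not match the paper's exact formulation). Suppose y_i = f(x_i) + ε_i with the ε_i drawn i.i.d. from p_θ, and P(f) uniform over all functions. The posterior over θ given the data set D = {(x_i, y_i)}_{i=1}^{n} is

\[
P(\theta \mid D) \;\propto\; P(\theta)\, \sum_{f} P(f)\, \prod_{i=1}^{n} p_\theta\!\bigl(y_i - f(x_i)\bigr).
\]

Because P(f) is uniform and each value f(x_i) ranges independently over the whole output alphabet, the sum over f factorizes across data points,

\[
\sum_{f} \prod_{i=1}^{n} p_\theta\!\bigl(y_i - f(x_i)\bigr)
\;\propto\; \prod_{i=1}^{n} \sum_{v} p_\theta\!\bigl(y_i - v\bigr) \;=\; 1,
\]

which does not depend on θ. Hence P(θ | D) = P(θ): under these assumptions, no finite data set can move the prior on the noise distribution.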