Lower Bounds for Bayes Error Estimation

  • Authors:
  • András Antos; Luc Devroye; László Györfi

  • Affiliations:
  • Computer and Automation Research Institute of the Hungarian Academy of Sciences, Budapest, Hungary; McGill Univ., Montreal, Canada; Technical Univ. of Budapest, Budapest, Hungary

  • Venue:
  • IEEE Transactions on Pattern Analysis and Machine Intelligence
  • Year:
  • 1999

Abstract

We give a short proof of the following result. Let $(X,Y)$ be any distribution on ${\cal N} \times \{0,1\}$, and let $(X_1,Y_1),\ldots,(X_n,Y_n)$ be an i.i.d. sample drawn from this distribution. In discrimination, the Bayes error $L^* = \inf_g {\bf P}\{g(X) \neq Y\}$ is of crucial importance. Here we show that, without further conditions on the distribution of $(X,Y)$, no rate-of-convergence results can be obtained. Let $\phi_n(X_1,Y_1,\ldots,X_n,Y_n)$ be an estimate of the Bayes error, and let $\{\phi_n(\cdot)\}$ be a sequence of such estimates. For any sequence $\{a_n\}$ of positive numbers converging to zero, a distribution of $(X,Y)$ may be found such that ${\bf E}\left\{ | L^* - \phi_n(X_1,Y_1,\ldots,X_n,Y_n) | \right\} \ge a_n$ infinitely often.
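
A useful standard identity, not stated in the abstract, is $L^* = {\bf E}\{\min(\eta(X), 1-\eta(X))\}$, where $\eta(x) = {\bf P}\{Y=1 \mid X=x\}$ is the a posteriori probability. The sketch below illustrates one concrete choice of an estimate $\phi_n$ covered by the theorem: a hypothetical plug-in estimate that replaces $\eta$ with a k-nearest-neighbour average. The function name, the one-dimensional feature space, and the choice $k=5$ are illustrative assumptions, not part of the paper; the theorem asserts that no such estimator, and indeed no estimator at all, admits a universal rate-of-convergence guarantee.

```python
import numpy as np

def plugin_bayes_error_estimate(X, Y, k=5):
    """Hypothetical plug-in estimate phi_n of the Bayes error L*.

    Estimates eta(x) = P{Y = 1 | X = x} by a k-nearest-neighbour average
    (the query point itself is allowed among its own neighbours, which is
    harmless for this illustration) and returns the empirical mean of
    min(eta_hat, 1 - eta_hat).
    """
    n = len(X)
    eta_hat = np.empty(n)
    for i in range(n):
        d = np.abs(X - X[i])            # distances in a 1-D feature space
        nn = np.argsort(d)[:k]          # indices of the k nearest sample points
        eta_hat[i] = Y[nn].mean()       # k-NN estimate of eta(X[i])
    return float(np.mean(np.minimum(eta_hat, 1.0 - eta_hat)))

# Toy usage: eta(x) = 0.3 for every x, so the true Bayes error is L* = 0.3.
rng = np.random.default_rng(0)
X = rng.uniform(size=1000)
Y = (rng.uniform(size=1000) < 0.3).astype(int)
print(plugin_bayes_error_estimate(X, Y))
```

On this easy toy distribution the estimate lands near $0.3$, but the theorem guarantees that for any fixed sequence of estimators a distribution can be constructed on which ${\bf E}\{|L^* - \phi_n|\}$ exceeds any prescribed sequence $a_n \to 0$ infinitely often.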