Lower limits of discrete universal denoising

  • Authors:
  • Krishnamurthy Viswanathan; Erik Ordentlich

  • Affiliations:
  • Hewlett-Packard Laboratories, Palo Alto, CA; Hewlett-Packard Laboratories, Palo Alto, CA

  • Venue:
  • IEEE Transactions on Information Theory
  • Year:
  • 2009

Abstract

In the spirit of results on universal compression, we compare the performance of universal denoisers for discrete memoryless channels to the best performance attainable by an omniscient kth-order sliding-window denoiser, namely, one that is tuned to the transmitted noiseless sequence. We show that the worst-case additional loss incurred by any universal denoiser on a length-n sequence is Ω(ck/√n), where c is a constant depending on the channel parameters and the loss function. This shows that, for fixed k, the additional loss incurred by the Discrete Universal Denoiser (DUDE) is within a constant multiplicative factor of the best possible. Furthermore, we compare universal denoisers to denoisers that are aware of the distribution of the transmitted noiseless sequence. We show that, even against this weaker benchmark, for any universal denoiser there exists a distribution on the noiseless sequence, corresponding to a sequence of independent and identically distributed (i.i.d.) random variables, under which the optimum expected loss is lower than that incurred by the universal denoiser by Ω(1/√n).
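
To make the benchmark concrete, the following is a minimal Python sketch of a kth-order sliding-window denoiser together with the omniscient rule the paper compares against: for each length-(2k+1) noisy context, the benchmark picks the reconstruction with the smallest empirical loss against the clean sequence, which a real denoiser never sees. The binary alphabet, Hamming loss, and all names (sliding_window_denoise, best_window_rule) are illustrative assumptions, not constructions from the paper.

```python
# A minimal sketch, not the paper's construction: a k-th order sliding-window
# denoiser applies a fixed rule mapping each length-(2k+1) noisy context to a
# reconstruction; the omniscient benchmark picks, for each context, the
# reconstruction with smallest empirical loss against the clean sequence.
# Binary alphabet, Hamming loss, and all names here are illustrative assumptions.

from collections import defaultdict


def hamming(x, xhat):
    """Hamming loss: 1 for a symbol error, 0 otherwise."""
    return float(x != xhat)


def sliding_window_denoise(noisy, rule, k, default=0):
    """Denoise by applying a fixed window->symbol rule at every position."""
    n, out = len(noisy), []
    for i in range(n):
        if i < k or i >= n - k:
            out.append(default)                      # boundary: fall back to a default guess
        else:
            window = tuple(noisy[i - k:i + k + 1])
            out.append(rule.get(window, noisy[i]))   # unseen context: pass the symbol through
    return out


def best_window_rule(clean, noisy, k, alphabet, loss=hamming):
    """Omniscient benchmark: choose, per noisy context, the reconstruction that
    minimizes cumulative loss against the (normally unknown) clean sequence."""
    totals = defaultdict(lambda: defaultdict(float))  # context -> candidate -> total loss
    for i in range(k, len(noisy) - k):
        window = tuple(noisy[i - k:i + k + 1])
        for cand in alphabet:
            totals[window][cand] += loss(clean[i], cand)
    return {w: min(c, key=c.get) for w, c in totals.items()}


if __name__ == "__main__":
    clean = [0, 0, 1, 1, 0, 0, 1, 1, 0, 0]
    noisy = [0, 1, 1, 1, 0, 0, 0, 1, 0, 0]  # clean sequence observed through a noisy channel
    rule = best_window_rule(clean, noisy, k=1, alphabet=[0, 1])
    denoised = sliding_window_denoise(noisy, rule, k=1)
    per_symbol_loss = sum(hamming(c, d) for c, d in zip(clean, denoised)) / len(clean)
    print("per-symbol loss of the omniscient 1st-order window rule:", per_symbol_loss)
```

The paper's lower bounds quantify how closely a universal scheme such as DUDE, which must infer its rule from the noisy sequence alone, can approach the loss of such a clairvoyant window rule.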