A context quantization approach to universal denoising

  • Authors:
  • Kamakshi Sivaramakrishnan; Tsachy Weissman

  • Affiliations:
  • Admob Inc., San Mateo, CA and Department of Electrical Engineering, Stanford University, Stanford, CA; Department of Electrical Engineering, Stanford University, Stanford, CA and Technion-Israel Institute of Technology, Haifa, Israel

  • Venue:
  • IEEE Transactions on Signal Processing
  • Year:
  • 2009

Abstract

We revisit the problem of denoising a discrete-time, continuous-amplitude signal corrupted by a known memoryless channel. By modifying our earlier approach to the problem, we obtain a scheme that is much more tractable than the original one while retaining its universal optimality properties. Universality refers to the fact that the proposed denoiser asymptotically (with increasing block length of the data) achieves the performance of an optimum denoiser with full knowledge of the distribution of the source generating the underlying clean sequence, the only restriction being that this distribution is stationary. The optimality of the denoiser, in a sense we make precise, also holds when the underlying clean sequence is an unknown deterministic one and the only source of randomness is the noise. The scheme involves a simple preprocessing step of quantizing the noisy symbols to generate quantized contexts. The quantized context value corresponding to each sequence component is then used to partition the unquantized symbols into subsequences, and a universal symbol-by-symbol denoiser (for unquantized sequences) is employed separately on each subsequence. We identify a rate at which the context length and quantization resolution should grow so that the resulting scheme is universal. The proposed family of schemes is computationally attractive, with an upper bound on complexity that is independent of the context length and the quantization resolution. Initial experimentation suggests that these schemes are not only superior from a computational viewpoint, but also achieve better denoising in practice.
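The pipeline the abstract describes (quantize the noisy symbols, form a context for each position from the quantized values, partition positions by context, then denoise each subsequence separately) can be sketched as follows. This is a minimal illustration under stated assumptions, not the paper's algorithm: the function name, the two-sided context of `k` symbols per side, the uniform `n_bins` quantizer, and the per-subsequence estimator (the subsequence mean, a stand-in for the universal symbol-by-symbol denoiser the paper actually employs) are all choices made here for concreteness.

```python
import numpy as np

def quantized_context_denoise(y, k=1, n_bins=4):
    """Sketch of context-quantization denoising (illustrative only).

    y      : noisy 1-D sequence (continuous amplitude)
    k      : context length on each side of a symbol (assumption)
    n_bins : quantization resolution for the context symbols (assumption)
    """
    y = np.asarray(y, dtype=float)
    n = len(y)

    # Step 1: quantize the noisy symbols to generate quantized contexts.
    lo, hi = y.min(), y.max()
    edges = np.linspace(lo, hi, n_bins + 1)
    q = np.clip(np.digitize(y, edges[1:-1]), 0, n_bins - 1)

    # Step 2: partition positions by their quantized two-sided context.
    groups = {}
    for i in range(k, n - k):
        ctx = tuple(q[i - k:i]) + tuple(q[i + 1:i + k + 1])
        groups.setdefault(ctx, []).append(i)

    # Step 3: denoise each subsequence of unquantized symbols separately.
    # The subsequence mean is a placeholder for the universal
    # symbol-by-symbol denoiser; boundary symbols are left as-is.
    x_hat = y.copy()
    for idx in groups.values():
        x_hat[idx] = y[idx].mean()
    return x_hat
```

Note that the per-position work is bounded by the dictionary lookup and the fixed-size context tuple, which is in the spirit of the complexity bound the abstract claims is independent of context length and quantization resolution, though this sketch makes no attempt to match the paper's actual complexity analysis.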