We consider the determination of a soft/hard coefficient threshold for recovering a signal embedded in additive Gaussian noise. This problem is closely related to variable selection in linear regression. Viewing denoising as a model selection problem, we propose a new information-theoretic model selection approach to signal denoising. We first construct a statistical model for the unknown signal and then seek the best approximating model (corresponding to the denoised signal) from a set of candidates. We adopt Kullback's symmetric divergence as a measure of similarity between the unknown model and a candidate model; the best approximating model is the one that minimizes an unbiased estimator of this divergence. The advantage of a denoising method based on model selection over classical thresholding approaches is that the threshold is determined automatically, without the need to estimate the noise variance. The proposed denoising method, called KICc-denoising (Kullback Information Criterion corrected), is compared with cross validation (CV), minimum description length (MDL) and the classical methods SureShrink and VisuShrink via a simulation study based on three different types of signal: chirp, seismic and piecewise polynomial.
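The overall recipe — transform the noisy signal into an orthonormal wavelet basis, score each candidate number of retained coefficients with a model selection criterion, and keep the minimizer — can be sketched in plain NumPy. This is an illustrative sketch only: it uses a simple Haar transform and an MDL-style penalty of 1.5·k·log(n) as a stand-in, since the paper's exact KICc penalty term is not reproduced here. The function names `haar_dwt`, `haar_idwt` and `kic_style_denoise` are hypothetical, not from the paper.

```python
import numpy as np

def haar_dwt(x):
    """Full multilevel orthonormal Haar transform (len(x) must be a power of two)."""
    c = np.asarray(x, dtype=float).copy()
    details = []
    while len(c) > 1:
        a = (c[0::2] + c[1::2]) / np.sqrt(2)   # approximation coefficients
        d = (c[0::2] - c[1::2]) / np.sqrt(2)   # detail coefficients
        details.append(d)
        c = a
    details.append(c)                          # final approximation (length 1)
    # layout: [approx, coarsest details, ..., finest details]
    return np.concatenate(details[::-1])

def haar_idwt(w):
    """Inverse of haar_dwt."""
    n = len(w)
    c = w[:1].copy()
    pos = 1
    while pos < n:
        d = w[pos:pos + len(c)]
        pos += len(c)
        up = np.empty(2 * len(c))
        up[0::2] = (c + d) / np.sqrt(2)
        up[1::2] = (c - d) / np.sqrt(2)
        c = up
    return c

def kic_style_denoise(y):
    """Keep the k largest wavelet coefficients, with k chosen by a
    model selection criterion of the form n*log(RSS/n) + penalty(k).
    NOTE: penalty(k) = 1.5*k*log(n) is an MDL-style placeholder,
    not the paper's KICc expression."""
    n = len(y)
    w = haar_dwt(y)
    order = np.argsort(np.abs(w))[::-1]        # indices sorted by magnitude, descending
    sq = w[order] ** 2
    tail = np.cumsum(sq[::-1])[::-1]           # tail[k] = energy of discarded coefficients
    best_k, best_crit = 1, np.inf
    for k in range(1, n):
        rss = tail[k] + 1e-12                  # residual sum of squares (Parseval)
        crit = n * np.log(rss / n) + 1.5 * k * np.log(n)
        if crit < best_crit:
            best_crit, best_k = crit, k
    w_hat = np.zeros_like(w)
    w_hat[order[:best_k]] = w[order[:best_k]]  # hard thresholding at the k-th magnitude
    return haar_idwt(w_hat)
```

Note that, as in the paper's approach, no estimate of the noise variance is supplied: the criterion trades residual energy against the number of retained coefficients directly.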