An algorithm for mean squared error (MSE) minimization through optimization of the bias-to-variance ratio has recently been proposed and used in the literature. The algorithm is based on analysis of the intersection of confidence intervals (ICI) and does not require explicit knowledge of the estimation bias to produce a near-to-optimal parameter estimate. This paper presents a detailed analysis of the algorithm's performance, including procedures and relations that can be used for fine adjustment of its parameters. The reliability of the algorithm is studied for various kinds of estimation noise, and the results are confirmed on a simulated example with uniform, Gaussian, and Laplacian noise. An illustration of the algorithm applied to a simple filtering example is also given.
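The core idea of the ICI rule, as commonly described in this literature, can be sketched as follows: given a sequence of estimates obtained with increasing filter support (so bias grows while variance shrinks), form a confidence interval around each estimate and keep enlarging the support as long as all intervals so far still intersect. The snippet below is a minimal illustrative sketch, not the paper's exact implementation; the function name `ici_window` and the confidence-interval width parameter `gamma` are assumptions for illustration.

```python
import numpy as np

def ici_window(estimates, sigmas, gamma=2.0):
    """Select a support size by the ICI rule (illustrative sketch).

    estimates : estimates ordered by increasing filter support
    sigmas    : corresponding estimate standard deviations (decreasing)
    gamma     : confidence-interval width factor (a tunable parameter)

    Returns the index of the largest support for which the confidence
    intervals [est - gamma*sigma, est + gamma*sigma] of all smaller
    supports still have a common intersection.
    """
    lower, upper = -np.inf, np.inf
    chosen = 0
    for k, (est, sig) in enumerate(zip(estimates, sigmas)):
        # Track the running intersection of all intervals seen so far.
        lower = max(lower, est - gamma * sig)
        upper = min(upper, est + gamma * sig)
        if lower > upper:
            # Intervals no longer intersect: bias now dominates; stop.
            break
        chosen = k
    return chosen

# Usage: when the estimates stay consistent, the largest (lowest-variance)
# support is kept; a sudden jump in the estimate triggers an early stop.
print(ici_window([1.0, 1.0, 1.0], [1.0, 1.0, 1.0]))     # consistent
print(ici_window([0.0, 0.0, 10.0], [1.0, 0.5, 0.25]))   # jump at index 2
```

Note that `gamma` plays exactly the role of the parameter whose fine adjustment the paper analyzes: too large a value delays the stop (more bias), too small a value stops early (more variance).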