Cleaning noise from signals is a classical and long-studied problem in signal processing. Algorithms for this task necessarily rely on a priori knowledge of the signal characteristics, along with information about the noise properties. For signals that admit sparse representations over a known dictionary, a commonly used denoising technique is to seek the sparsest representation that synthesizes a signal close enough to the corrupted one. Since this problem is computationally intractable in general, approximation methods such as greedy pursuit algorithms are often employed. This line of reasoning suggests that finding the sparsest representation is the key to successful denoising. Does this mean that other competitive, slightly inferior sparse representations are meaningless? Suppose we are given a collection of competing sparse representations, each claiming to explain the signal differently. Can they somehow be fused to yield a better result? Surprisingly, the answer is positive: merging these representations can form a more accurate (in the mean-squared-error (MSE) sense), yet dense, estimate of the original signal, even when the latter is known to be sparse. In this paper we demonstrate this behavior, propose a practical way to generate such a collection of representations by randomizing the Orthogonal Matching Pursuit (OMP) algorithm, and provide a clear analytical justification for the superiority of the resulting Randomized OMP (RandOMP) algorithm. We show that while the maximum a posteriori (MAP) estimator aims to find and use the sparsest representation, the minimum mean-squared-error (MMSE) estimator leads to a fusion of representations to form its result. Thus, by working with an appropriate mixture of candidate representations, we surpass the MAP estimate and tend towards the MMSE estimate, thereby obtaining a far more accurate estimation in terms of the expected l2-norm error.