Standard formulations of image/signal deconvolution under wavelet-based priors/regularizers lead to very high-dimensional optimization problems with two main difficulties: the non-Gaussian (heavy-tailed) wavelet priors yield objective functions that are nonquadratic, usually nondifferentiable, and sometimes even nonconvex; and the convolution operator destroys the separability that underlies the simplicity of wavelet-based denoising. This paper presents a unified view of several recently proposed algorithms for handling this class of optimization problems, placing them in a common majorization-minimization (MM) framework. One of the classes of algorithms considered (obtained by using quadratic bounds on nondifferentiable log-priors) shares the infamous "singularity issue" (SI) of "iteratively reweighted least squares" (IRLS) algorithms: the possibility of having to handle infinite weights, which may cause both numerical and convergence problems. In this paper, we prove several new results which strongly support the claim that the SI does not compromise the usefulness of this class of algorithms. Exploiting the unified MM perspective, we introduce a new algorithm, resulting from using bounds for nonconvex regularizers; the experiments confirm the superior performance of this method when compared to the one based on quadratic majorization. Finally, an experimental comparison of the several algorithms reveals their relative merits for different standard types of scenarios.
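To make the quadratic-majorization/IRLS connection and the "singularity issue" concrete, here is a minimal NumPy sketch (not the paper's implementation; the toy circular blur, the test signal, and the `eps` floor on the weights are all assumptions added for illustration). Majorizing each |x_i| at the current iterate x_t by the quadratic (x_i^2 + x_t,i^2)/(2|x_t,i|) turns the l1-regularized deconvolution problem into a sequence of weighted least-squares solves; the weights 1/|x_t,i| blow up as x_t,i approaches zero, which is exactly the SI, here side-stepped by capping the weights.

```python
import numpy as np

def irls_deconv(H, y, lam, n_iter=100, eps=1e-8):
    """MM/IRLS for min_x 0.5*||y - Hx||^2 + lam*||x||_1.

    Each MM step majorizes |x_i| at the current iterate by a
    quadratic, giving the weighted least-squares update
        x <- (H^T H + lam * diag(w))^{-1} H^T y,
    with w_i = 1/|x_i|. The eps floor caps the weights that
    would otherwise become infinite when x_i -> 0 (the SI).
    """
    n = H.shape[1]
    x = np.ones(n)                      # strictly nonzero start
    HtH, Hty = H.T @ H, H.T @ y
    for _ in range(n_iter):
        w = 1.0 / np.maximum(np.abs(x), eps)   # IRLS weights
        x = np.linalg.solve(HtH + lam * np.diag(w), Hty)
    return x

# Toy problem: circular [0.25, 0.5, 0.25] blur of a sparse signal.
rng = np.random.default_rng(0)
n = 32
kernel = np.array([0.25, 0.5, 0.25])
H = np.zeros((n, n))
for i in range(n):
    for j, k in enumerate(kernel):
        H[i, (i + j - 1) % n] = k       # circulant blur matrix
x_true = np.zeros(n)
x_true[[5, 20]] = [1.0, -0.8]
y = H @ x_true + 0.01 * rng.standard_normal(n)
x_hat = irls_deconv(H, y, lam=0.02)
```

Note that the blur matrix here is singular (its frequency response vanishes at the Nyquist frequency), so the `lam * diag(w)` term is also what keeps each inner solve well-posed.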