Using first principles, we establish in this paper a connection between the maximum a posteriori (MAP) estimator and the variational formulation of optimizing a given functional subject to noise constraints. A MAP estimator that uses a Markov or a maximum-entropy random field model as a prior distribution can be viewed as the minimizer of a variational problem. Using notions from robust statistics, we propose a variational filter called the Huber gradient descent flow. It yields the solution to a Huber-type functional subject to noise constraints, and the resulting filter behaves like total variation anisotropic diffusion for large gradient magnitudes and like isotropic diffusion for small gradient magnitudes. Using some of the gained insight, we also propose an information-theoretic gradient descent flow whose functional turns out to be a compromise between a negentropy variational integral and a total variation term. Illustrative examples demonstrate substantially improved performance of the proposed filters in the presence of Gaussian and heavy-tailed noise.
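The switching behavior of the Huber flow described above can be sketched numerically: the Huber function ρ has ρ'(s)/s = 1 for s ≤ k (isotropic, heat-equation-like diffusion) and ρ'(s)/s = k/s for s > k (total-variation-like diffusion). The explicit finite-difference scheme below is a minimal illustration of that diffusivity switch, not the authors' implementation; the threshold `k`, step size `dt`, iteration count, and boundary handling are all illustrative assumptions.

```python
import numpy as np

def huber_flow(u, k=0.2, dt=0.2, n_iter=20):
    """Sketch of a Huber gradient descent flow (explicit scheme).

    Diffusivity g(s) = rho'(s)/s for the Huber function rho:
    g(s) = 1 when s <= k (isotropic diffusion in flat regions),
    g(s) = k/s when s > k (TV-like diffusion near strong edges).
    """
    u = u.astype(float).copy()
    eps = 1e-8  # avoids division by zero in flat regions
    for _ in range(n_iter):
        # forward differences, gradient replicated at the boundary
        ux = np.diff(u, axis=1, append=u[:, -1:])
        uy = np.diff(u, axis=0, append=u[-1:, :])
        mag = np.sqrt(ux**2 + uy**2) + eps
        # Huber diffusivity: isotropic below k, TV-like above
        g = np.where(mag <= k, 1.0, k / mag)
        # divergence of the flux via backward differences
        fx, fy = g * ux, g * uy
        div = (fx - np.roll(fx, 1, axis=1)) + (fy - np.roll(fy, 1, axis=0))
        u += dt * div
    return u
```

Because the update is a discrete divergence, the scheme conserves the image mean while contracting the variance, so small-amplitude Gaussian noise is smoothed isotropically while gradients above `k` receive only the slower TV-type diffusion.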