A successful class of image denoising methods is based on Bayesian approaches working in wavelet representations. The performance of these methods improves when relations among the local frequency coefficients are explicitly included. However, in these techniques, analytical estimates can be obtained only for particular combinations of analytical models of signal and noise, thus precluding their straightforward extension to other, arbitrary noise sources. In this paper, we propose an alternative, non-explicit way to take into account the relations among natural image wavelet coefficients for denoising: we use support vector regression (SVR) in the wavelet domain to enforce these relations in the estimated signal. Since the relations among the coefficients are specific to the signal, the regularization property of SVR is exploited to remove the noise, which does not share this feature. The specific signal relations are encoded in an anisotropic kernel obtained from mutual information measures computed on a representative image database. In the proposed scheme, training minimizes the Kullback-Leibler divergence (KLD) between the estimated and actual probability functions (or histograms) of signal and noise, in order to enforce similarity up to the highest computationally estimable order. Due to its non-parametric nature, the method can eventually cope with different noise sources without requiring an explicit reformulation, as is strictly necessary under parametric Bayesian formalisms. Results under several noise levels and noise sources show that: (1) the proposed method outperforms conventional wavelet methods that assume coefficient independence; (2) it performs comparably to state-of-the-art methods that do explicitly include these relations when the noise source is Gaussian; and (3) it gives better numerical and visual performance when more complex, realistic noise sources are considered.
Therefore, the proposed machine learning approach can be seen as a more flexible (model-free) alternative to the explicit description of wavelet coefficient relations for image denoising.
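The core idea above can be sketched in a simplified form. The following is a minimal 1-D analogue (not the paper's actual pipeline: it uses a standard isotropic RBF kernel rather than the mutual-information-based anisotropic kernel, and a plain smooth signal rather than wavelet subbands of natural images). It illustrates the KLD training criterion: SVR hyperparameters are chosen so that the residual between the noisy observation and the SVR estimate matches, in distribution, the assumed noise model. All variable names and parameter values are illustrative assumptions.

```python
import numpy as np
from sklearn.svm import SVR

rng = np.random.default_rng(0)

# Illustrative setup: a smooth "signal" corrupted by additive Gaussian noise
# (stand-in for a wavelet subband; values chosen for demonstration only).
x = np.linspace(0.0, 4.0 * np.pi, 400)
signal = np.sin(x) + 0.5 * np.sin(3.0 * x)
sigma = 0.3
noisy = signal + rng.normal(0.0, sigma, size=x.shape)

def kld_hist(p_samples, q_samples, bins=30):
    """Symmetrised KL divergence between two sample histograms."""
    lo = min(p_samples.min(), q_samples.min())
    hi = max(p_samples.max(), q_samples.max())
    p, _ = np.histogram(p_samples, bins=bins, range=(lo, hi), density=True)
    q, _ = np.histogram(q_samples, bins=bins, range=(lo, hi), density=True)
    eps = 1e-10                      # avoid log(0) in empty bins
    p, q = p + eps, q + eps
    p, q = p / p.sum(), q / q.sum()
    return 0.5 * (np.sum(p * np.log(p / q)) + np.sum(q * np.log(q / p)))

# Reference samples drawn from the assumed noise model.
noise_ref = rng.normal(0.0, sigma, size=4000)

# Model selection: pick the epsilon-insensitivity whose residual histogram
# is closest (in KLD) to the assumed noise histogram.
best = None
for eps_ins in [0.05, 0.1, 0.2, 0.3, 0.5]:
    svr = SVR(kernel="rbf", C=10.0, gamma=2.0, epsilon=eps_ins)
    svr.fit(x[:, None], noisy)
    estimate = svr.predict(x[:, None])
    residual = noisy - estimate      # should be distributed like the noise
    d = kld_hist(residual, noise_ref)
    if best is None or d < best[0]:
        best = (d, eps_ins, estimate)

d_star, eps_star, denoised = best
print(f"selected epsilon = {eps_star}, residual-vs-noise KLD = {d_star:.3f}")
```

In the paper's actual scheme the regression runs over wavelet coefficient neighborhoods with an anisotropic kernel, but the selection principle is the same: the estimate is accepted when what it removes from the observation looks statistically like the noise, not like the signal.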