Most expectation-maximization (EM) type algorithms for penalized maximum-likelihood image reconstruction converge slowly, particularly when one incorporates additive background effects such as scatter, random coincidences, dark current, or cosmic radiation. In addition, regularizing smoothness penalties (or priors) introduce parameter coupling, rendering the M-steps of most EM-type algorithms intractable. This paper presents space-alternating generalized EM (SAGE) algorithms for image reconstruction, which update the parameters sequentially using a sequence of small “hidden” data spaces, rather than simultaneously using one large complete-data space. The sequential update decouples the M-step, so the maximization can typically be performed analytically. We introduce new hidden-data spaces that are less informative than the conventional complete-data space for Poisson data and that yield significant improvements in convergence rate. This acceleration is due to statistical considerations, not numerical overrelaxation methods, so monotonic increases in the objective function are guaranteed. We provide a general global convergence proof for SAGE methods with nonnegativity constraints.
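To make the contrast concrete, the sketch below compares a simultaneous ML-EM update with a sequential, one-parameter-at-a-time update for a Poisson model with a known additive background. This is an illustrative coordinate-wise EM update in the spirit of the sequential scheme described above, not the paper's exact hidden-data construction; the system matrix `A`, background `r`, and problem sizes are invented for the example. Each single-coordinate update is itself an EM step, so the penalty-free log-likelihood increases monotonically, which the loop checks.

```python
import numpy as np

rng = np.random.default_rng(0)

# Tiny synthetic problem: y_i ~ Poisson((A x)_i + r_i), where r models
# additive background (scatter, randoms, dark current).  All quantities
# here are illustrative, not from the paper.
m, n = 40, 8
A = rng.uniform(0.1, 1.0, size=(m, n))
x_true = rng.uniform(1.0, 5.0, size=n)
r = np.full(m, 0.5)                     # known positive background
y = rng.poisson(A @ x_true + r).astype(float)

def loglik(x):
    """Poisson log-likelihood (dropping the constant log y! term)."""
    mu = A @ x + r
    return np.sum(y * np.log(mu) - mu)

def em_update(x):
    """Classical simultaneous ML-EM step: all parameters move at once."""
    mu = A @ x + r
    return x * (A.T @ (y / mu)) / A.sum(axis=0)

def sequential_update(x):
    """Sequential update: parameters are visited one at a time, and each
    update immediately sees the effect of earlier ones through an
    incrementally maintained projection mu = A x + r."""
    x = x.copy()
    mu = A @ x + r
    for j in range(n):
        old = x[j]
        x[j] = old * (A[:, j] @ (y / mu)) / A[:, j].sum()
        mu += A[:, j] * (x[j] - old)    # cheap projection refresh
    return x

x = np.ones(n)
for _ in range(20):
    x_new = sequential_update(x)
    assert loglik(x_new) >= loglik(x) - 1e-9   # monotone ascent
    x = x_new
```

Because each coordinate update conditions on the most recent values of the others, the sequential sweep typically climbs the objective faster per pass than the simultaneous update, mirroring the convergence-rate improvement the abstract attributes to less informative hidden-data spaces.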