Stability results for the reconstruction of binary pictures from two projections
Image and Vision Computing
The task of reconstructing binary images from the knowledge of their line sums (discrete X-rays) in a given finite number m of directions is ill-posed. Even a small amount of noise in the physical measurements can lead to dramatically different yet still unique solutions. The present paper addresses in particular the following problems. Does discrete tomography have the power of error correction? Can noise be compensated by taking more X-ray images, and, if so, what is the quantitative effect of taking one more X-ray? Our main theorem gives the first nontrivial unconditional (and best possible) stability result. In particular, we show that the Hamming distance between any two different sets of m X-ray images of the same cardinality is at least 2(m-1), and this bound is best possible. As a consequence, this result implies a Rényi-type theorem for denoising and shows that the noise-compensating effect of X-rays is linear in their number. Our theoretical results are complemented by determining the computational complexity of some underlying algorithmic tasks. In particular, we show that while there is always a certain inherent stability, the possibility of making (worst-case) efficient use of it is rather limited.
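The stability bound can be made concrete with a small sketch. The code below is illustrative only, not the paper's construction: it takes two different lattice sets of equal cardinality, computes their X-ray data (line sums) in m = 3 directions, and checks that the total absolute difference between the two data sets meets the 2(m-1) bound. Measuring the discrepancy as the summed absolute difference of the integer-valued line sums is our reading of the distance on the data; the helper names (`xray`, `data_distance`) and the choice of directions are our own.

```python
from collections import Counter

def xray(points, direction):
    """Line sums of a finite lattice set in one lattice direction.
    A line parallel to direction (a, b) is indexed by c = b*x - a*y."""
    a, b = direction
    return Counter(b * x - a * y for (x, y) in points)

def data_distance(s1, s2, directions):
    """Total absolute difference between the X-ray data of two sets,
    summed over all lines in all given directions."""
    total = 0
    for d in directions:
        r1, r2 = xray(s1, d), xray(s2, d)
        for c in set(r1) | set(r2):
            total += abs(r1[c] - r2[c])
    return total

# Two different lattice sets of equal cardinality that agree on the
# horizontal and vertical X-rays but differ on the diagonal one.
S1 = {(0, 0), (1, 1)}
S2 = {(0, 1), (1, 0)}
directions = [(1, 0), (0, 1), (1, 1)]  # m = 3
m = len(directions)

dist = data_distance(S1, S2, directions)
print(dist, ">=", 2 * (m - 1))
assert dist >= 2 * (m - 1)  # the stability bound holds for this pair
```

In this example the two sets form a switching component for the horizontal and vertical directions, so all of the discrepancy is concentrated in the diagonal X-ray, and the bound 2(m-1) = 4 is attained exactly, consistent with the claim that the result is best possible.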