We examine the structure of families of distortion balls from the perspective of Kolmogorov complexity. Special attention is paid to the canonical rate-distortion function of a source word, which returns the minimal Kolmogorov complexity over all distortion balls containing that word, subject to a bound on their cardinality. This canonical rate-distortion function is related to the more standard algorithmic rate-distortion function for the given distortion measure; examples are given for list distortion, Hamming distortion, and Euclidean distortion. The algorithmic rate-distortion function can behave differently from Shannon's rate-distortion function. To this end, we show that the canonical rate-distortion function can and does assume a wide class of shapes (unlike Shannon's); we relate low algorithmic mutual information to low Kolmogorov complexity, which suggests that certain aspects of the mutual-information formulation of Shannon's rate-distortion function behave differently from an analogous formulation using algorithmic mutual information; and we develop the notion that distortion balls of low Kolmogorov complexity containing a given word capture the interesting properties of that word, a notion that is hard to formalize in Shannon's theory and that suggests an approach to denoising.
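As an illustrative sketch, not taken from the paper itself: for binary strings of length n under Hamming distortion, the distortion ball of radius d around a word has cardinality equal to the sum of binomial coefficients C(n, i) for i = 0..d, and Kolmogorov complexity, being uncomputable, can only be upper-bounded in practice, e.g. by the output length of a real compressor. The function names below are my own, chosen for this sketch.

```python
from math import comb
import zlib

def hamming_ball_size(n: int, d: int) -> int:
    # Number of binary strings within Hamming distance d of a fixed
    # length-n string: sum of binomial coefficients C(n, i), i = 0..d.
    return sum(comb(n, i) for i in range(d + 1))

def complexity_proxy(x: bytes) -> int:
    # Kolmogorov complexity K(x) is uncomputable; the length of the
    # output of a real compressor is a crude computable upper bound.
    return len(zlib.compress(x, 9))

# The ball around a word grows from 1 (d = 0: only the word itself)
# up to 2**n (d = n: all length-n strings).
n = 16
print(hamming_ball_size(n, 0))   # 1
print(hamming_ball_size(n, n))   # 65536

# A highly regular word compresses far better than a less regular one,
# mirroring the intuition that simple words lie in simple balls.
print(complexity_proxy(b"\x00" * 1000)
      < complexity_proxy(bytes(range(256)) * 4))  # True
```

The cardinality bound in the canonical rate-distortion function corresponds here to fixing log of the ball size, after which one asks for the simplest (lowest-complexity) such ball containing the word.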