Information Theory and Reliable Communication
Low-Density Codes Achieve the Rate-Distortion Bound
DCC '06 Proceedings of the Data Compression Conference
Elements of Information Theory (Wiley Series in Telecommunications and Signal Processing)
Nonlinear sparse-graph codes for lossy compression
IEEE Transactions on Information Theory
Polar codes are optimal for lossy source coding
IEEE Transactions on Information Theory
Achieving the rate-distortion bound with low-density generator matrix codes
IEEE Transactions on Communications
Simple universal lossy data compression schemes derived from the Lempel-Ziv algorithm
IEEE Transactions on Information Theory
Linear-time encodable and decodable error-correcting codes
IEEE Transactions on Information Theory
The redundancy of source coding with a fidelity criterion. 1. Known statistics
IEEE Transactions on Information Theory
On the role of pattern matching in information theory
IEEE Transactions on Information Theory
An implementable lossy version of the Lempel-Ziv algorithm. I. Optimality for memoryless sources
IEEE Transactions on Information Theory
Efficient erasure correcting codes
IEEE Transactions on Information Theory
Error exponent for source coding with a fidelity criterion
IEEE Transactions on Information Theory
A coding theorem for lossy data compression by LDPC codes
IEEE Transactions on Information Theory
Complexity-compression tradeoffs in lossy compression via efficient random codebooks and databases
Problems of Information Transmission
We present several results related to the complexity-performance tradeoff in lossy compression. The first result shows that for a memoryless source with rate-distortion function R(D) and a bounded distortion measure, the rate-distortion point $(R(D) + \gamma, D + \varepsilon)$ can be achieved with constant decompression time per (separable) symbol and compression time per symbol proportional to $(\lambda_1/\varepsilon)^{\lambda_2/\gamma^2}$, where $\lambda_1$ and $\lambda_2$ are source-dependent constants. The second result establishes that the same point can be achieved with constant decompression time and compression time per symbol proportional to $(\rho_1/\gamma)^{\rho_2/\varepsilon^2}$. These results imply, for any function g(n) that increases without bound arbitrarily slowly, the existence of a sequence of lossy compression schemes of blocklength n with O(ng(n)) compression complexity and O(n) decompression complexity that achieve the point (R(D), D) asymptotically with increasing blocklength. We also establish that if the reproduction alphabet is finite, then for any given R there exists a universal lossy compression scheme with O(ng(n)) compression complexity and O(n) decompression complexity that achieves the point (R, D(R)) asymptotically for any stationary ergodic source with distortion-rate function D(·).
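To make the compression/decompression asymmetry in the abstract concrete, here is a minimal sketch (not the paper's construction) of lossy compression with a random codebook over a binary source under Hamming distortion: encoding requires an exhaustive search over the codebook, while decoding is a single table lookup, i.e., constant time per symbol. The function names, block length, and rate are illustrative choices, not taken from the paper.

```python
import random

def compress(block, codebook):
    """Exhaustive search: return the index of the codeword with
    minimum Hamming distortion from the source block (costly step)."""
    return min(range(len(codebook)),
               key=lambda i: sum(a != b for a, b in zip(block, codebook[i])))

def decompress(index, codebook):
    """A single table lookup: constant time per reproduced symbol."""
    return codebook[index]

random.seed(0)
n, rate = 12, 0.5                      # block length, rate in bits per symbol
M = 2 ** int(rate * n)                 # codebook size 2^{nR}

# Random codebook: M codewords drawn uniformly from {0,1}^n.
codebook = [[random.randint(0, 1) for _ in range(n)] for _ in range(M)]

source = [random.randint(0, 1) for _ in range(n)]
idx = compress(source, codebook)       # search over all M codewords
rec = decompress(idx, codebook)        # lookup only
dist = sum(a != b for a, b in zip(source, rec)) / n
print(f"rate = {rate} bits/symbol, empirical distortion = {dist:.3f}")
```

The encoder's search over $2^{nR}$ codewords is what the paper's constructions tame: the complexity bounds above trade a small rate/distortion slack $(\gamma, \varepsilon)$ for per-symbol compression time that no longer grows exponentially in the blocklength.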