Compression is most important when space is in short supply, so compression algorithms are often implemented in limited memory. Most analyses, however, treat memory constraints as an implementation detail and ignore them, creating a gap between theory and practice. In this paper we consider the effect of memory limitations on compression algorithms. In the first part we assume the available memory is fixed and prove nearly tight upper and lower bounds on how much memory is needed to compress a string to within a given distance of its k-th order empirical entropy. In the second part we assume the available memory grows (slowly) as more and more characters are read. In this setting we show that the rate of growth of the available memory determines the speed at which the compression ratio approaches the entropy. In particular, we establish a relationship between the rate of growth of the sliding window in the LZ77 algorithm and its convergence rate.
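To make the growing-window setting concrete, here is a toy sketch (not the paper's construction) of an LZ77-style parser in which the search window available at position i is capped by a caller-supplied growth function; the function name and the specific growth rules are illustrative assumptions only. A slowly growing window models slowly growing memory: fewer, longer phrases become possible as more of the input can be remembered.

```python
def lz77_growing_window(text, window_size):
    """Parse `text` into LZ77 phrases (offset, length, next_char).

    `window_size(i)` gives the maximum number of preceding characters
    searchable at position i; a growing function models memory that
    increases as the input is read. Naive O(n * w) matching, for
    illustration only.
    """
    phrases = []
    i, n = 0, len(text)
    while i < n:
        start = max(0, i - max(1, window_size(i)))
        best_len, best_off = 0, 0
        for j in range(start, i):
            length = 0
            # Matches may overlap the current position, as in LZ77;
            # stop one short of the end so a "next char" always exists.
            while i + length < n - 1 and text[j + length] == text[i + length]:
                length += 1
            if length > best_len:
                best_len, best_off = length, i - j
        phrases.append((best_off, best_len, text[i + best_len]))
        i += best_len + 1
    return phrases

def lz77_decode(phrases):
    """Invert the parse above, copying `ln` chars from `off` back."""
    out = []
    for off, ln, ch in phrases:
        for _ in range(ln):
            out.append(out[-off])
        out.append(ch)
    return "".join(out)
```

On a repetitive input, a parse with a generously growing window should use no more phrases than one with a tiny fixed window, mirroring (very loosely) the trade-off the abstract describes between window growth and compression quality.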