A locally adaptive data compression scheme
Communications of the ACM
Design and analysis of dynamic Huffman codes
Journal of the ACM (JACM)
Systolic implementations of a move-to-front text compressor
SPAA '89 Proceedings of the first annual ACM symposium on Parallel algorithms and architectures
Practical dictionary management for hardware data compression
Communications of the ACM
Splay trees for data compression
Proceedings of the sixth annual ACM-SIAM symposium on Discrete algorithms
A Universal Statistical Test for Random Bit Generators
CRYPTO '90 Proceedings of the 10th Annual International Cryptology Conference on Advances in Cryptology
Move-to-Front and Inversion Coding
DCC '00 Proceedings of the Conference on Data Compression
“Book Stack” as a New Statistical Test for Random Numbers
Problems of Information Transmission
Antisequential Suffix Sorting for BWT-Based Data Compression
IEEE Transactions on Computers
Squeezing succinct data structures into entropy bounds
SODA '06 Proceedings of the seventeenth annual ACM-SIAM symposium on Discrete algorithms
Squarepants in a tree: Sum of subtree clustering and hyperbolic pants decomposition
ACM Transactions on Algorithms (TALG)
Post BWT stages of the Burrows–Wheeler compression algorithm
Software—Practice & Experience
In the schemes presented, the encoder maps each message into a codeword from a prefix-free codeword set. In interval encoding the codeword is indexed by the interval since the last previous occurrence of that message, so the codeword set must be countably infinite. In recency rank encoding the codeword is indexed by the number of distinct messages in that interval, so there must be at least as many codewords as messages. The decoder decodes each codeword on receipt. Users need not know the message probabilities, but they must agree on two indexings: of the codeword set in order of increasing length, and of the message set in some arbitrary order. The average codeword length over a communications bout is never much larger than that of an off-line scheme which maps the j-th most frequent message in the bout into the j-th shortest codeword in the given set, and is never much larger than that of off-line Huffman encoding of the messages into the best codeword set for the bout's message frequencies. Both schemes can do much better than Huffman coding when successive occurrences of each message type cluster much more than they would in the independent case.
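The two indexing rules described above can be sketched in a few lines. The following is a minimal illustration, not the paper's exact construction: the first-occurrence convention in `interval_encode` and the use of Elias gamma codes as the length-ordered prefix-free codeword set are assumptions made for the example.

```python
def recency_rank_encode(messages, alphabet):
    """Recency rank (move-to-front): the index emitted for a message is
    the number of distinct messages seen since its previous occurrence."""
    stack = list(alphabet)          # agreed-upon initial message ordering
    ranks = []
    for m in messages:
        r = stack.index(m)          # rank 0 = most recently seen
        ranks.append(r)
        stack.pop(r)
        stack.insert(0, m)          # move the message to the front
    return ranks

def interval_encode(messages, alphabet):
    """Interval encoding: the index emitted is the number of messages
    transmitted since this message's last occurrence. First occurrences
    use a hypothetical convention (time elapsed plus the message's
    position in the agreed alphabet ordering, plus one)."""
    last = {}                       # message -> time of last occurrence
    intervals = []
    for t, m in enumerate(messages):
        if m in last:
            intervals.append(t - last[m])
        else:
            intervals.append(t + alphabet.index(m) + 1)  # assumed convention
        last[m] = t
    return intervals

def elias_gamma(n):
    """One example of a countably infinite prefix-free codeword set
    indexed in order of increasing length: the Elias gamma code of
    a positive integer n (floor(log2 n) zeros, then n in binary)."""
    b = bin(n)[2:]
    return "0" * (len(b) - 1) + b
```

For the message sequence `"aabab"` over the alphabet `"ab"`, `recency_rank_encode` emits `[0, 0, 1, 1, 1]`: clustered repeats of a message keep earning the smallest ranks, which is why these schemes beat Huffman coding on strongly clustered sources. Each emitted integer would then be sent as the corresponding codeword, e.g. its Elias gamma code.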