Journal of Algorithms
Design and analysis of dynamic Huffman codes
Journal of the ACM (JACM)
Analysis of arithmetic coding for data compression
Information Processing and Management: an International Journal - Special issue on data compression for images and texts
An improved data structure for cumulative probability tables
Software: Practice & Experience
Bounding the compression loss of the FGK algorithm
Journal of Algorithms
An analysis of the Burrows-Wheeler transform
Journal of the ACM (JACM)
Huffman coding with unequal letter costs
STOC '02 Proceedings of the thirty-fourth annual ACM symposium on Theory of computing
Compression and Coding Algorithms
The sound of silence: guessing games for saving energy in a mobile environment
Journal of Parallel and Distributed Computing - Special issue on wireless networks
Generalized Shannon Code Minimizes the Maximal Redundancy
LATIN '02 Proceedings of the 5th Latin American Symposium on Theoretical Informatics
Length-Restricted Coding in Static and Dynamic Frameworks
DCC '01 Proceedings of the Data Compression Conference
Dynamic asymmetric communication
SIROCCO'06 Proceedings of the 13th international conference on Structural Information and Communication Complexity
A fast and efficient nearly-optimal adaptive Fano coding scheme
Information Sciences: an International Journal
A dynamic programming algorithm for constructing optimal prefix-free codes with unequal letter costs
IEEE Transactions on Information Theory
Huffman codes and self-information
IEEE Transactions on Information Theory
Variations on a theme by Huffman
IEEE Transactions on Information Theory
Worst-Case Optimal Adaptive Prefix Coding
WADS '09 Proceedings of the 11th International Symposium on Algorithms and Data Structures
Minimax trees in linear time with applications
European Journal of Combinatorics
We present the first algorithm for one-pass instantaneous coding which, given an integer ℓ > 0 and a string S of length m over an alphabet of size n, is guaranteed to encode S using at most (H + 1 + 1/((2^ℓ - 1) ln 2))m + O(n log m) bits, where H denotes the 0th-order empirical entropy of S, in time proportional to m, and with no codeword longer than log n + ℓ + 1 bits.
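The bound above trades redundancy against maximum codeword length: increasing ℓ shrinks the 1/((2^ℓ - 1) ln 2) redundancy term but permits codewords up to log n + ℓ + 1 bits. A minimal sketch evaluating the main term of the bound (the function name and example values are illustrative, not from the source; the O(n log m) term is omitted):

```python
import math

def worst_case_bits(H, m, ell):
    """Main term of the stated upper bound on the encoded length of a
    string of length m with 0th-order empirical entropy H, for a given
    integer ell > 0: (H + 1 + 1/((2^ell - 1) * ln 2)) * m.
    The O(n log m) additive term is ignored here."""
    return (H + 1 + 1.0 / ((2**ell - 1) * math.log(2))) * m

# Larger ell tightens the redundancy term, at the cost of allowing
# codewords up to log n + ell + 1 bits.
for ell in (1, 2, 4, 8):
    print(ell, worst_case_bits(H=2.0, m=1_000_000, ell=ell))
```

For ℓ = 1 the redundancy term is 1/ln 2 ≈ 1.44 bits per symbol, while by ℓ = 8 it has dropped below 0.006 bits per symbol.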