This paper introduces a novel lossless binary data compression scheme based on the error-correcting Hamming codes, namely the HCDC scheme. In this scheme, the binary sequence to be compressed is divided into blocks of n bits. To utilize the Hamming codes, each block is treated as a Hamming codeword consisting of p parity bits and d data bits (n = d + p). Each block is then tested to determine whether or not it is a valid Hamming codeword. For a valid block, only the d data bits, preceded by a 1, are written to the compressed file; for a non-valid block, all n bits, preceded by a 0, are written. These extra 1 and 0 flag bits are used to distinguish valid from non-valid blocks during decompression. An analytical formula is derived for the compression ratio as a function of the block size and the fraction of valid data blocks in the sequence. The performance of the HCDC scheme is analyzed, and the results obtained are presented in tables and graphs. Finally, conclusions and recommendations for future work are given.
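The compression step described in the abstract can be sketched as follows. This is an illustrative reconstruction, not the authors' implementation: it assumes the (7,4) Hamming code (n = 7, d = 4, p = 3), a standard parity-check matrix, and the usual convention that data bits occupy the non-power-of-two positions; all names are hypothetical.

```python
# Hypothetical sketch of HCDC compression for n = 7 (d = 4, p = 3),
# i.e. the Hamming (7,4) code. Layout and names are assumptions.

# Parity-check matrix of the (7,4) Hamming code:
# column j (1-based) is the binary representation of j.
H = [
    [0, 0, 0, 1, 1, 1, 1],
    [0, 1, 1, 0, 0, 1, 1],
    [1, 0, 1, 0, 1, 0, 1],
]

# Data bits sit at the non-power-of-two positions 3, 5, 6, 7 (0-based: 2, 4, 5, 6);
# positions 1, 2, 4 hold the p parity bits.
DATA_POSITIONS = [2, 4, 5, 6]

def is_valid_codeword(block):
    """A block is a valid Hamming codeword iff its syndrome H.b is zero (mod 2)."""
    return all(sum(h * b for h, b in zip(row, block)) % 2 == 0 for row in H)

def hcdc_compress(bits, n=7):
    """Compress a bit list block by block, as the scheme describes:
    valid block  -> flag 1 followed by its d data bits (n -> d+1 bits),
    other block  -> flag 0 followed by all n bits      (n -> n+1 bits)."""
    out = []
    for i in range(0, len(bits), n):
        block = bits[i:i + n]
        if len(block) == n and is_valid_codeword(block):
            out.append(1)
            out.extend(block[j] for j in DATA_POSITIONS)
        else:
            out.append(0)
            out.extend(block)
    return out
```

From the per-block costs above (d + 1 output bits for a valid block, n + 1 for a non-valid one), the compression ratio for a fraction r of valid blocks works out to C = n / (r(d + 1) + (1 - r)(n + 1)), which matches the abstract's description of the ratio as a function of block size and the fraction of valid blocks; compression is achieved only when r is large enough that the p-bit savings on valid blocks outweigh the one-bit flags.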