A parallel architecture for high-speed data compression
Journal of Parallel and Distributed Computing
When the ever-present pursuit of space saving reaches its limit and a file cannot be compressed any further, a new question arises: "How can we improve our compression even more?". The answer is obvious: "Let's speed it up!". This article looks for the meeting point of space saving and compression-time reduction. The reduction rests on a simple scheme: a compression task is broken into smaller subtasks, which are compressed simultaneously and then joined back together. Five compression algorithms are used, two of which are entropy coders and three of which are dictionary coders. Each algorithm is analyzed individually, and the algorithms are then compared by compression performance and speed as a function of the number of cores used. To summarize the work, a speedup diagram shows whether Amdahl's and Gustafson's laws hold in practice.
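The scheme described in the abstract — split the input into chunks, compress the chunks in parallel, join the results — can be sketched as follows. This is a minimal illustration, not the paper's implementation: the chunk size, worker count, and the choice of zlib as the coder are assumptions made here for the example. Each chunk is compressed independently, so the output is a list of blocks that must also be decompressed per block.

```python
# Sketch of block-based parallel compression: split, compress chunks
# concurrently, join. zlib releases the GIL during (de)compression,
# so a thread pool yields real parallelism for this workload.
import zlib
from concurrent.futures import ThreadPoolExecutor

CHUNK_SIZE = 64 * 1024  # hypothetical block size, not from the paper


def compress_parallel(data, workers=4):
    """Compress each CHUNK_SIZE slice of `data` independently."""
    chunks = [data[i:i + CHUNK_SIZE] for i in range(0, len(data), CHUNK_SIZE)]
    with ThreadPoolExecutor(max_workers=workers) as pool:
        return list(pool.map(zlib.compress, chunks))


def decompress_parallel(blocks, workers=4):
    """Decompress the independently compressed blocks and rejoin them."""
    with ThreadPoolExecutor(max_workers=workers) as pool:
        return b"".join(pool.map(zlib.decompress, blocks))


if __name__ == "__main__":
    original = b"the quick brown fox jumps over the lazy dog " * 5000
    blocks = compress_parallel(original)
    assert decompress_parallel(blocks) == original
    print(f"{len(original)} -> {sum(len(b) for b in blocks)} bytes "
          f"in {len(blocks)} blocks")
```

Note the trade-off the abstract alludes to: compressing chunks in isolation loses cross-chunk redundancy, so the ratio is slightly worse than single-stream compression, which is exactly the tension between space saving and speed that the article examines.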