Introduction to Information Theory and Data Compression

  • Authors:
  • Darrel Hankerson; Peter D. Johnson; Greg A. Harris

  • Venue:
  • Introduction to Information Theory and Data Compression
  • Year:
  • 1998

Abstract

From the Publisher: An effective blend of carefully explained theory and practical applications, this book has been written to offer access to the basics of information theory and data compression. The authors have applied their experience in teaching information theory and data compression to the careful preparation and unique organization of this one-of-a-kind text. This pioneering textbook serves two independent courses, one in information theory and one in data compression, and also proves valuable for independent study and as a reference. Its treatment of information theory, while theoretical and "academic," is pitched at an elementary level with computational examples and exercises. The treatment of data compression is from a practical, engineering perspective. It fully explains, illustrates, and provides numerous exercises for a number of different techniques. The text covers lossless source-text methods most comprehensively, but also includes a long, final chapter on lossy compression, emphasizing transform methods on images, that could serve as the basis of a separate course. Introduction to Information Theory and Data Compression unobtrusively applies information theory to forming and answering theoretical questions in data compression. Those curious about data compression with no interest in information theory, and vice versa, can use the text profitably, as can those with a strong curiosity about the connections between the two areas.
Table of Contents

INFORMATION THEORY

  • Elementary Probability: Introduction; Events; Conditional Probability; Independence; Bernoulli Trials; An Elementary Counting Principle; On Drawing without Replacement; Random Variables and Expected, or Average, Value; The Law of Large Numbers
  • Information and Entropy: Systems of Events; Information; Entropy; Information and Entropy
  • Channels and Channel Capacity: Discrete Memoryless Channels; Transition Probabilities and Binary Symmetric Channels; Input Frequencies; Channel Capacity; Proof of Theorem 3.4.1
  • Coding Theory: Encoding and Decoding; Prefix-Condition Codes and the Kraft-McMillan Inequality; Average Code Word Length and Huffman's Algorithm; Optimizing the Input Frequencies; Error Correction, Maximum Likelihood Decoding, Nearest Code Word Decoding, and Reliability; Shannon's Noisy Channel Theorem; Error Correction with Binary Symmetric Channels and Equal Source Frequencies

DATA COMPRESSION

  • Lossless Data Compression by Replacement Schemes: Replacement via Encoding Scheme; Review of the Prefix Condition; How to Choose an Encoding Scheme; The Noiseless Coding Theorem and Shannon's Bound
  • Arithmetic Coding: Pure Zeroth-Order Arithmetic Coding: dfwld; What's Good about dfwld Coding: The Compression Ratio; What's Bad about dfwld Coding and Some Ways to Fix It; Implementing Arithmetic Coding; Notes
  • Higher-Order Modeling: Higher-Order Huffman Encoding; The Shannon Bound for Higher-Order Encoding; Higher-Order Arithmetic Coding; Statistical Models, Statistics, and the Possibly Unknowable Truth
  • Adaptive Methods: Adaptive Huffman Encoding; Maintaining the Tree in Adaptive Huffman Encoding: The Method of Knuth and Gallager; Adaptive Arithmetic Coding; Interval and Recency Rank Encoding
  • Dictionary Methods: LZ77 (Sliding Window) Schemes; The LZ78 Approach; Notes
  • Transform Methods and Image Compression: Transforms; Periodic Signals and the Fourier Transform; The Cosine and Sine Transforms; Two-Dimensional Transforms; An Application: JPEG Image Compression; A Brief Introduction to Wavelets; Notes

APPENDICES

  • JPEGtool User's Guide; Source Listing for LZRW1-A; Resources, Patents, and Illusions; Notes on the Exercises; Bibliography; Index