Quantifying information flow with beliefs
Journal of Computer Security - 18th IEEE Computer Security Foundations Symposium (CSF 18)
Noisy timing channels with binary inputs and outputs
IH'06 Proceedings of the 8th international conference on Information hiding
A Free Object in Quantum Information Theory
Electronic Notes in Theoretical Computer Science (ENTCS)
Pivot selection method for optimizing both pruning and balancing in metric space indexes
DEXA'10 Proceedings of the 21st international conference on Database and expert systems applications: Part II
From the Publisher: Information theory and coding theory are two related aspects of the same problem: how to transmit information efficiently and accurately. This book provides a clear introduction to both subjects, emphasising the relationship and links between the two. The first part, concentrating on information theory, covers uniquely decodable and instantaneous codes, Huffman coding, entropy, information channels and Shannon's Fundamental Theorem. The second part, on coding theory, uses linear algebra to construct examples of error-correcting codes, such as the Hamming, Hadamard, Golay and Reed-Muller codes. Carefully explained proofs and worked examples are highlighted throughout, and exercises with solutions are provided to consolidate understanding of the main concepts and techniques. Assuming only some basic probability theory and linear algebra, together with a little calculus, this book is aimed at second- and third-year undergraduate students in mathematics, electronics and computer science.
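To illustrate two of the topics named in the description above — Shannon entropy and Huffman coding — the following is a minimal Python sketch, not code from the book itself. The symbol frequencies and the example string are invented for demonstration; the construction is the standard greedy Huffman algorithm over a min-heap.

```python
import heapq
from collections import Counter
from math import log2

def entropy(probs):
    """Shannon entropy H(X) = -sum p * log2(p), in bits per symbol."""
    return -sum(p * log2(p) for p in probs if p > 0)

def huffman_code(freqs):
    """Build a prefix-free Huffman code from a symbol -> frequency map.

    Each heap entry is (total weight, tiebreak counter, partial codebook);
    the counter keeps tuple comparison away from the dicts.
    """
    heap = [(w, i, {s: ""}) for i, (s, w) in enumerate(freqs.items())]
    heapq.heapify(heap)
    count = len(heap)
    while len(heap) > 1:
        # Merge the two lightest subtrees, prefixing their codewords.
        w1, _, c1 = heapq.heappop(heap)
        w2, _, c2 = heapq.heappop(heap)
        merged = {s: "0" + code for s, code in c1.items()}
        merged.update({s: "1" + code for s, code in c2.items()})
        heapq.heappush(heap, (w1 + w2, count, merged))
        count += 1
    return heap[0][2]

text = "abracadabra"          # hypothetical sample input
freqs = Counter(text)
code = huffman_code(freqs)
avg_len = sum(freqs[s] * len(code[s]) for s in freqs) / len(text)
H = entropy([w / len(text) for w in freqs.values()])
```

The noiseless-coding theorem discussed in the book's first part guarantees that the average codeword length satisfies H ≤ avg_len < H + 1, which can be checked directly on the values computed above.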