As is well known, the binary code length of a point x ∈ ℝ (with accuracy h > 0) is approximately m_h(x) ≈ log₂ max(1, |x|/h). We consider the problem of translating the origin a of the coordinate system so that the mean number of bits needed to code a randomly chosen element from a realization of a random variable X is minimal. In other words, we want to find a ∈ ℝ such that the function

    ℝ ∋ a ↦ E(m_h(X − a))

attains its minimum. We show that under reasonable assumptions the optimal choice of a is asymptotically independent of h. Consequently, we reduce the problem to finding the minimum of the function

    ℝ ∋ a ↦ ∫_ℝ ln|x − a| f(x) dx,

where f is the density of the random variable X. Moreover, we provide a constructive approach for determining a.
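The reduction above can be checked numerically. The following sketch (not from the paper; the Gaussian distribution, sample size, and grid are illustrative assumptions) estimates E(m_h(X − a)) by Monte Carlo for two very different accuracies h, and also the h-free criterion E(ln|X − a|); the three minimizers over a coarse grid should nearly coincide.

```python
# Illustrative sketch: grid-search the origin a minimizing the mean code
# length E[m_h(X - a)], and compare with the h-free criterion E[ln|X - a|].
# Distribution, sample size, and grid are assumptions for demonstration only.
import math
import random

random.seed(0)

# Hypothetical example: X ~ N(mu, sigma^2).
mu, sigma = 3.0, 1.0
samples = [random.gauss(mu, sigma) for _ in range(20_000)]

def mean_code_length(a, h):
    """Monte Carlo estimate of E[m_h(X - a)], m_h(x) = log2(max(1, |x|/h))."""
    return sum(math.log2(max(1.0, abs(x - a) / h)) for x in samples) / len(samples)

def log_criterion(a):
    """Monte Carlo estimate of E[ln|X - a|], the h-free criterion."""
    return sum(math.log(abs(x - a)) for x in samples) / len(samples)

grid = [mu - 2.0 + 0.1 * k for k in range(41)]  # search a in [mu-2, mu+2]

best_h1 = min(grid, key=lambda a: mean_code_length(a, 1e-3))
best_h2 = min(grid, key=lambda a: mean_code_length(a, 1e-6))
best_log = min(grid, key=log_criterion)

# The minimizers should nearly coincide, illustrating that the optimal
# translation a is asymptotically independent of h.
print(best_h1, best_h2, best_log)
```

For the symmetric Gaussian above, all three minimizers land near the mean mu, consistent with the criterion's sign-symmetric integrand.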