This paper considers the entropy of highly correlated quantized samples. Two results are shown. The first concerns sampling and identically scalar quantizing a stationary continuous-time random process over a finite interval. It is shown that if the process crosses a quantization threshold with positive probability, then the joint entropy of the quantized samples tends to infinity as the sampling rate goes to infinity. The second result provides an upper bound on the rate at which the joint entropy tends to infinity, in the case of an infinite-level uniform threshold scalar quantizer and a stationary Gaussian random process. Specifically, an asymptotic formula is derived for the conditional entropy of one quantized sample given the previous quantized sample. At high sampling rates, these results reveal a sharp contrast between the large encoding rate (in bits/sec) required by a lossy source code consisting of a fixed scalar quantizer followed by an ideal, sampling-rate-adapted lossless code, and the bounded encoding rate required by an ideal lossy source code operating at the same distortion.
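The contrast described above can be illustrated numerically. The sketch below (a rough simulation, not the paper's derivation) uses a stationary Gauss-Markov process as the sampled process, an infinite-level uniform threshold quantizer `Q(x) = floor(x/step)`, and a plug-in estimate of the conditional entropy H(Q(X_{k+1}) | Q(X_k)). The process choice and the parameter values are illustrative assumptions; the qualitative behavior it exhibits matches the abstract: as the sampling rate grows, the per-sample conditional entropy shrinks, yet the encoding rate in bits/sec (sampling rate times conditional entropy) grows without bound.

```python
import numpy as np

rng = np.random.default_rng(0)

def cond_entropy_bits(fs, step=1.0, n=200_000):
    """Estimate H(Q(X_{k+1}) | Q(X_k)) in bits for a stationary
    Gauss-Markov process sampled at rate fs and quantized by an
    infinite-level uniform threshold quantizer with cell width `step`."""
    rho = np.exp(-1.0 / fs)  # correlation between adjacent samples
    x = rng.standard_normal(n)
    y = rho * x + np.sqrt(1.0 - rho**2) * rng.standard_normal(n)
    qx = np.floor(x / step).astype(int)
    qy = np.floor(y / step).astype(int)

    def entropy(labels):
        # Empirical (plug-in) entropy of a sequence of discrete labels.
        _, counts = np.unique(labels, axis=0, return_counts=True)
        p = counts / counts.sum()
        return float(-(p * np.log2(p)).sum())

    # H(Qy | Qx) = H(Qx, Qy) - H(Qx)
    return entropy(np.stack([qx, qy], axis=1)) - entropy(qx)

for fs in [1, 4, 16, 64]:
    h = cond_entropy_bits(fs)
    print(f"fs={fs:3d}  H per sample ~ {h:.3f} bits   rate ~ {fs * h:.2f} bits/sec")
```

Running this shows the per-sample conditional entropy decreasing in `fs` while `fs * h` increases, i.e., a fixed scalar quantizer plus ideal lossless coding spends ever more bits/sec as the sampling rate grows, exactly the divergence the first result establishes.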