The problem of reliably transmitting a real-valued random vector through a noisy digital channel is relevant to the design of distributed estimation and control techniques over networked systems. One important example is remote state estimation under communication constraints. In this setting, an anytime transmission scheme consists of an encoder, which maps the real vector into a sequence of channel inputs, and a decoder, which sequentially updates its estimate of the vector as more channel outputs are observed. The encoder performs both source and channel coding of the data. Assuming that no channel feedback is available at the transmitter, this paper studies the rate at which the mean squared error converges to zero. Two coding strategies are analyzed: the first achieves an exponential convergence rate but at a high encoder/decoder computational cost, while the second has a modest computational complexity but only a subexponential convergence rate. General bounds describing the convergence properties of both classes of schemes are derived.
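The anytime idea (the decoder's estimate improves monotonically as more channel outputs arrive) can be illustrated with a minimal sketch. The example below is an assumption-laden toy, not the paper's scheme: it uses the binary expansion of a scalar in [0, 1) as a successive-refinement source code and, for simplicity, assumes a noiseless channel, so no channel coding is shown. With n received bits, the decoder pins the source to an interval of width 2^-n and returns its midpoint, so the squared error decays exponentially in n.

```python
def encode(x, n_bits):
    # Successive-refinement source code: the binary expansion of x in [0, 1).
    # Each new bit halves the decoder's uncertainty interval.
    bits = []
    for _ in range(n_bits):
        x *= 2
        b = int(x)
        bits.append(b)
        x -= b
    return bits

def decode(bits):
    # Sequential (anytime) decoder: can be run on any prefix of the bit
    # stream; returns the midpoint of the interval the bits pin down.
    est = sum(b * 2.0 ** -(i + 1) for i, b in enumerate(bits))
    return est + 2.0 ** -(len(bits) + 1)

# The estimate refines as more "channel outputs" (here, clean bits) arrive.
x = 0.6180339887
for n in (2, 4, 8, 16):
    err2 = (x - decode(encode(x, n))) ** 2
    print(n, err2)  # squared error is bounded by 4^-(n+1)
```

Over a noisy channel the bits would additionally need channel-code protection, and more significant bits would warrant stronger protection (unequal error protection), which is where the exponential-versus-subexponential trade-off studied in the paper arises.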