In this paper we consider the problem of transmitting quantized data while performing an average consensus algorithm. Average consensus algorithms are protocols that compute the average of all sensor measurements through communication among neighboring nodes. The main motivation for our work is the observation that consensus algorithms offer a prime example of network communications in which the correlation between the exchanged data increases as the system updates its computations. Hence, previously exchanged data can serve as side information to significantly reduce the quantization bit rate required for a given precision. We analyze the case of a network whose topology is that of a random geometric graph and whose links are assumed to be reliable at a constant bit rate. We show numerically that in consensus algorithms, increasing the number of iterations does not increase the error variance; thus the noisy recursions still lead to consensus, provided the data correlation is exploited by the source encoders and decoders of the messages. We briefly state theoretical results that parallel our numerical experiments.
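The idea described above can be illustrated with a minimal simulation sketch. The graph construction, Metropolis weights, node count, radius, and quantizer step size below are illustrative assumptions, not the authors' exact scheme: each node broadcasts only the uniformly quantized innovation with respect to its neighbors' stored prediction, so as the states become correlated the innovation shrinks and a fixed bit budget suffices.

```python
# Hypothetical sketch (not the paper's exact scheme): average consensus
# on a random geometric graph with differentially quantized messages.
import numpy as np

rng = np.random.default_rng(0)

# Random geometric graph: n nodes in the unit square, linked when
# closer than radius r (illustrative parameter choices).
n, r = 30, 0.35
pts = rng.random((n, 2))
dist = np.linalg.norm(pts[:, None, :] - pts[None, :, :], axis=2)
A = ((dist < r) & (dist > 0)).astype(float)

# Metropolis weights give a symmetric, doubly stochastic matrix W,
# so the network average is preserved by the update below.
deg = A.sum(axis=1)
W = np.zeros((n, n))
for i in range(n):
    for j in range(n):
        if A[i, j]:
            W[i, j] = 1.0 / (1.0 + max(deg[i], deg[j]))
    W[i, i] = 1.0 - W[i].sum()

x = rng.random(n)       # initial sensor measurements
x0 = x.copy()
avg = x.mean()          # target consensus value

def quantize(v, step):
    """Uniform quantizer with the given step size."""
    return step * np.round(v / step)

# Differential (predictive) scheme: each node transmits the quantized
# innovation x_i - xhat_i; receivers add it to their stored prediction
# xhat_i, which serves as side information shared by encoder and decoder.
xhat = np.zeros(n)      # reconstruction of each state held at receivers
step = 0.1              # illustrative quantizer step
for t in range(200):
    xhat = xhat + quantize(x - xhat, step)  # broadcast quantized innovation
    x = x + (W - np.eye(n)) @ xhat          # consensus update on reconstructions
```

Because W is doubly stochastic, the update `x + (W - I) @ xhat` leaves the network average unchanged regardless of quantization error, while the differential encoding keeps the quantizer operating on an ever smaller innovation as the states converge.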