The paper studies the problem of distributed average consensus in sensor networks with quantized data and random link failures. To achieve consensus, dither (small noise) is added to the sensor states before quantization. When the quantizer range is unbounded (a countably infinite number of quantizer levels), a stochastic approximation argument shows that consensus is achieved asymptotically, with probability one and in mean square, to a finite random variable. We show that the mean squared error (MSE) can be made arbitrarily small by tuning the link weight sequence, at the cost of a slower convergence rate. To study dithered consensus with random links when the range of the quantizer is bounded, we establish the uniform boundedness of the sample paths of the unbounded-quantizer state. This requires characterizing the statistical properties of the supremum taken over the sample paths of the quantizer state, which we accomplish by splitting the quantizer state vector into two components: one along the consensus subspace and one along the subspace orthogonal to it. The proofs use maximal inequalities for submartingale and supermartingale sequences, from which we derive probability bounds on the excursions of the two components and, in turn, on the excursions of the quantizer state vector. The paper shows how to use these probability bounds to design the quantizer parameters and to explore tradeoffs among the number of quantizer levels, the size of the quantization steps, the desired probability of saturation, and the desired level of accuracy ε away from consensus. Finally, the paper illustrates the quantizer design with a numerical study.
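To make the setting concrete, below is a minimal simulation sketch in Python. It is an illustration under stated assumptions, not the paper's implementation: all names, the topology, and parameter values (n, delta, p_fail, the weight constant) are invented for the example. Each node adds uniform dither to its state before quantization, exchanges quantized states over links that fail independently at each iteration, and updates with a decreasing weight sequence alpha_t = a/(t+1), which satisfies the conditions sum(alpha_t) = infinity and sum(alpha_t^2) < infinity used in the stochastic approximation analysis. The quantizer here is unbounded (plain rounding), matching the first regime analyzed in the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative parameters (assumptions, not values from the paper).
n = 20          # number of sensors
delta = 0.1     # quantization step size
steps = 2000    # number of iterations
p_fail = 0.3    # independent per-iteration link failure probability

# A fixed connected topology: a ring with a few long-range chords.
edges = [(i, (i + 1) % n) for i in range(n)] + [(i, (i + 5) % n) for i in range(n)]

x = rng.uniform(0.0, 10.0, size=n)  # initial sensor states
target = x.mean()                   # the average the network should agree on

def dithered_quantize(v, step, rng):
    """Uniform quantizer with additive dither: noise uniform on
    [-step/2, step/2) is added before rounding, which makes the
    quantization error zero mean (the 'small noise' of the abstract)."""
    u = rng.uniform(-step / 2.0, step / 2.0, size=v.shape)
    return step * np.round((v + u) / step)

for t in range(steps):
    alpha = 0.25 / (t + 1)                # decreasing weights: sum = inf, sum of squares < inf
    q = dithered_quantize(x, delta, rng)  # quantized states actually transmitted
    update = np.zeros(n)
    for i, j in edges:
        if rng.random() > p_fail:         # this link is up at iteration t
            update[i] += q[j] - x[i]
            update[j] += q[i] - x[j]
    x = x + alpha * update

print(f"target average : {target:.4f}")
print(f"final states   : min {x.min():.4f}, max {x.max():.4f}")
```

Running the sketch, the node states should settle within a small neighborhood of the true average; shrinking delta or the weight constant tightens the residual MSE at the price of slower convergence, mirroring the tradeoff between accuracy and convergence rate described above.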