A linear compressive network (LCN) is defined as a graph of sensors in which each encoding sensor compresses incoming jointly Gaussian random signals and transmits (potentially) low-dimensional linear projections to neighbors over a noisy uncoded channel. Each sensor has a maximum power to allocate over signal subspaces. The networks of focus are acyclic, directed graphs with multiple sources and multiple destinations. LCN pathways lead to decoding leaf nodes that estimate linear functions of the original high-dimensional sources by minimizing a mean squared error (MSE) distortion cost function. An iterative optimization of local compressive matrices for all graph nodes is developed using an optimal quadratically constrained quadratic program (QCQP) step. The performance of the optimization is characterized by power-compression-distortion spectra, with converse bounds based on cut-set arguments. Examples include single-layer and multi-layer (e.g. p-layer tree cascades, butterfly) networks. The LCN is a generalization of the Karhunen-Loève Transform to noisy multi-layer networks, and extends previous approaches for point-to-point and distributed compression-estimation of Gaussian signals. The framework relates to network coding in the noiseless case, and to uncoded transmission in the noisy case.
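The single-sensor building block of the setup above can be sketched numerically: a jointly Gaussian source is projected by a power-constrained compression matrix, sent over an additive-noise channel, and recovered by a linear MMSE decoder whose MSE distortion is then evaluated. This is a minimal illustration under assumed parameters (dimensions, power budget, noise variance), using a KLT-style top-eigenvector projection as one plausible encoder, not the paper's iterative QCQP optimization.

```python
import numpy as np

rng = np.random.default_rng(0)

n, k = 8, 3      # source dimension, compressed dimension (assumed)
P = 1.0          # sensor transmit power budget (assumed)
sigma2 = 0.1     # channel noise variance (assumed)

# Jointly Gaussian source x ~ N(0, Sigma_x) with a random covariance.
A = rng.standard_normal((n, n))
Sigma_x = A @ A.T / n

# One candidate encoder: project onto the top-k eigenvectors of Sigma_x
# (a KLT-style choice), then scale to meet the power constraint
# trace(C Sigma_x C^T) = P.
eigvals, eigvecs = np.linalg.eigh(Sigma_x)
C = eigvecs[:, -k:].T                       # k x n projection
C *= np.sqrt(P / np.trace(C @ Sigma_x @ C.T))

# Noisy uncoded channel: y = C x + w, with w ~ N(0, sigma2 I).
Sigma_y = C @ Sigma_x @ C.T + sigma2 * np.eye(k)

# Linear MMSE decoder W and the resulting MSE distortion:
# Sigma_e = Sigma_x - Sigma_x C^T Sigma_y^{-1} C Sigma_x.
W = Sigma_x @ C.T @ np.linalg.inv(Sigma_y)
mse = np.trace(Sigma_x - W @ C @ Sigma_x)
print(f"MSE distortion: {mse:.4f} (source trace: {np.trace(Sigma_x):.4f})")
```

The distortion is strictly below the trace of the source covariance (the distortion of estimating with no observation), and shrinks as the power budget grows or the noise variance falls, tracing out the power-compression-distortion trade-off in miniature.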