In many energy-constrained wireless sensor networks, nodes cooperatively forward correlated sensed data to data sinks. To reduce the communication cost (e.g., overall energy) of data collection, previous work has focused on specific coding schemes, such as Slepian-Wolf coding or explicit entropy coding. However, the minimum communication cost under arbitrary coding/routing schemes has not yet been characterized. In this paper, we consider the problem of minimizing the total communication cost of a wireless sensor network with a single sink. We prove that the minimum communication cost can be achieved using Slepian-Wolf coding and commodity-flow routing when the link communication cost is a convex function of the link data rate. Furthermore, we introduce a new metric, distance entropy, a generalization of entropy, to characterize the data-collection limit of networked sources. When energy consumption is proportional to the link data rate (as is typical in 802.11), we show that distance entropy is a lower bound on the communication cost and that this bound can be achieved using a specific-rate Slepian-Wolf code together with shortest-path routing. Theoretically, achieving optimality may require global knowledge of the data-correlation structure, which may not be available in practice. We therefore propose a simple hierarchical scheme that primarily exploits data correlation between local neighboring nodes. We show that for several correlation structures and topologies, the communication cost achieved by this scheme is within a constant factor of the distance entropy, i.e., it is asymptotically optimal. Finally, we simulate our algorithm using radar reflectivity data as well as traces from Gaussian Markov fields (GMFs). As the network size grows, our algorithm saves two thirds of the communication cost of a non-coding approach on the radar data, and on the GMF data it converges to a constant factor (1.5 to 1.8) of the distance entropy.
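The abstract does not give the formal definition of distance entropy, so the sketch below illustrates one plausible reading of the lower bound it describes: process nodes in order of increasing shortest-path distance to the sink, charge each node its conditional entropy given all strictly closer nodes, and weight that by its hop distance. All names here (`hop_distances`, `distance_entropy_bound`, the toy chain topology, and the entropy values) are illustrative assumptions, not the paper's actual construction.

```python
from collections import deque

def hop_distances(adj, sink):
    """BFS hop count from every node to the sink (unit-cost links assumed)."""
    dist = {sink: 0}
    q = deque([sink])
    while q:
        u = q.popleft()
        for v in adj[u]:
            if v not in dist:
                dist[v] = dist[u] + 1
                q.append(v)
    return dist

def distance_entropy_bound(adj, sink, marginal_h, cond_h):
    """Illustrative distance-entropy-style cost: sum over sensor nodes of
    (shortest-path hop count) x (conditional entropy given all nodes
    already processed), with nodes taken in order of increasing distance.
    cond_h(node, closer_nodes) supplies the correlation model's
    conditional entropies; the model itself is an assumption here."""
    dist = hop_distances(adj, sink)
    order = sorted((n for n in adj if n != sink), key=lambda n: dist[n])
    cost, seen = 0.0, set()
    for n in order:
        h = marginal_h[n] if not seen else cond_h(n, frozenset(seen))
        cost += dist[n] * h
        seen.add(n)
    return cost

# Toy 3-node chain sink - a - b - c with a first-order Markov correlation:
# each node's data is fully described by 4 bits, but given any nearer
# node's reading only 1 extra bit is needed (hypothetical numbers).
adj = {'sink': ['a'], 'a': ['sink', 'b'], 'b': ['a', 'c'], 'c': ['b']}
H = {'a': 4.0, 'b': 4.0, 'c': 4.0}            # marginal entropies (bits)
cond = lambda n, closer: 1.0                   # H(X_i | nearer nodes)

lower = distance_entropy_bound(adj, 'sink', H, cond)
# Non-coding baseline: every node ships its full marginal entropy.
dist = hop_distances(adj, 'sink')
naive = sum(dist[n] * H[n] for n in H)
print(lower, naive)   # 9.0 vs 24.0 on this toy instance
```

Even on this three-node chain the gap between the correlated-coding cost and the non-coding cost is large, which is the effect the abstract's two-thirds saving on radar data reflects at scale.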