On optimal communication cost for gathering correlated data through wireless sensor networks

  • Authors:
  • Junning Liu; Micah Adler; Don Towsley; Chun Zhang

  • Affiliations:
  • University of Massachusetts; University of Massachusetts; University of Massachusetts; IBM T.J. Watson Research Center

  • Venue:
  • Proceedings of the 12th annual international conference on Mobile computing and networking
  • Year:
  • 2006

Abstract

In many energy-constrained wireless sensor networks, nodes cooperatively forward correlated sensed data to data sinks. To reduce the communication cost (e.g., overall energy) of data collection, previous works have focused on specific coding schemes, such as Slepian-Wolf coding (SWC) or explicit entropy coding. However, the minimum communication cost under arbitrary coding/routing schemes has not yet been characterized. In this paper, we consider the problem of minimizing the total communication cost of a wireless sensor network with a single sink. We prove that the minimum communication cost can be achieved using Slepian-Wolf coding and commodity flow routing when the link communication cost is a convex function of the link data rate. Furthermore, we find it useful to introduce a new metric, distance entropy, a generalization of entropy, to characterize the data collection limit of networked sources. When the energy consumption is proportional to the link data rate (e.g., as is typical in 802.11), we show that the distance entropy provides a lower bound on the communication cost and can be achieved using a specific-rate SWC together with shortest path routing. In theory, achieving optimality may require global knowledge of the data correlation structure, which may not be available in practice. We therefore propose a simple, hierarchical scheme that primarily exploits data correlation between local neighboring nodes. We show that for several correlation structures and topologies, the communication cost achieved by this scheme is within a constant factor of the distance entropy, i.e., it is asymptotically optimal. Finally, we simulate our algorithm using radar reflectivity data as well as traces from Gaussian Markov Fields (GMF). As the network size grows, our algorithm saves two thirds of the communication cost of a non-coding approach on the radar data, and converges to within a constant factor (1.5–1.8) of the distance entropy on the GMF data.
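The abstract does not give the formula for distance entropy, but its role as a lower bound can be illustrated under one plausible reading: order the nodes by increasing shortest-path distance to the sink, and weight each node's conditional entropy (given all closer nodes) by that distance. The sketch below uses this assumed definition with purely illustrative numbers; the function names and the toy chain are not from the paper.

```python
def distance_entropy(nodes):
    """Assumed reading of the metric: sum over nodes of
    (shortest-path distance to sink) * (conditional entropy of the
    node's source given all sources closer to the sink).
    `nodes` is a list of (distance, cond_entropy_bits) pairs,
    already sorted by increasing distance."""
    return sum(d * h for d, h in nodes)

def noncoding_cost(nodes, marginal_bits):
    # Baseline with no coding: every node ships its full marginal
    # entropy along its shortest path, ignoring correlation.
    return sum(d * marginal_bits for d, _ in nodes)

# Toy chain: three sensors at hop distances 1, 2, 3 from the sink.
# Each source carries 1 bit of marginal entropy, but given the
# already-decoded closer sources only 0.3 bits remain
# (illustrative values, not measured data).
nodes = [(1, 1.0), (2, 0.3), (3, 0.3)]
lower_bound = distance_entropy(nodes)   # 1*1.0 + 2*0.3 + 3*0.3 = 2.5
baseline = noncoding_cost(nodes, 1.0)   # 1 + 2 + 3 = 6.0
```

In this toy instance the correlation-aware lower bound (2.5 bit-hops) is well under half the non-coding cost (6.0 bit-hops), mirroring the kind of savings the abstract reports for the radar data.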