Distortion outage minimization in Rayleigh fading using limited feedback

  • Authors:
  • Chih-Hong Wang; Subhrakanti Dey

  • Affiliations:
  • University of Melbourne, Parkville, Victoria; University of Melbourne, Parkville, Victoria

  • Venue:
  • GLOBECOM'09: Proceedings of the 28th IEEE Conference on Global Telecommunications
  • Year:
  • 2009

Abstract

In this paper, we investigate the problem of distortion outage minimization in a clustered sensor network where sensors within each cluster send their noisy measurements of a random Gaussian source to their respective clusterheads (CHs) using analog forwarding and a non-orthogonal multi-access scheme, under the assumption of perfect distributed beamforming. The CHs then amplify and forward their measurements to a remote fusion center over orthogonal Rayleigh block-fading channels. Due to fading, the distortion between the true value of the random source and its reconstructed estimate at the fusion center becomes a random process. Motivated by delay-limited applications, we seek to minimize the probability that the distortion exceeds a given threshold (the "distortion outage" probability) by optimally allocating transmit powers to the CHs. In general, the outage-minimizing power allocation for the CH transmitters requires full instantaneous channel state information (CSI) at the transmitters, which is difficult to obtain in practice. The novelty of this paper lies in designing locally optimal and suboptimal power allocation algorithms that are simple to implement and use limited channel feedback, where the fusion center broadcasts only a few bits of feedback to the CHs. Numerical results for a 6-cluster sensor network illustrate that a few bits of feedback provide a significant improvement over no CSI, and that only 6-8 bits of feedback yield outage probabilities reasonably close to the full-CSI performance. We also present results using an optimization algorithm based on simultaneous perturbation stochastic approximation (SPSA), which provides further improvements in outage performance but at the cost of much greater computational complexity.
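
The central quantity in the abstract is the distortion outage probability Pr(D > D_th) under Rayleigh block fading. The sketch below estimates it by Monte Carlo for a given power allocation. The distortion model is a simplified stand-in (only the CH-to-fusion-center hop, with received SNRs adding up at the fusion center), not the paper's exact end-to-end expression, and all parameter values (source variance, noise variance, threshold) are assumptions for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

N = 6             # number of clusters/CHs (the paper's numerical example)
sigma_s2 = 1.0    # variance of the Gaussian source (assumed)
noise2 = 0.1      # fusion-center noise variance (assumed)
D_th = 0.05       # distortion threshold defining an outage (assumed)

def distortion(gains, powers):
    # Simplified stand-in for the end-to-end MMSE distortion: orthogonal
    # amplify-and-forward links whose received SNRs add up at the fusion
    # center. The paper's exact expression also involves the sensor-to-CH
    # stage; this toy model keeps only the CH-to-FC hop.
    snr = gains * powers / noise2
    return sigma_s2 / (1.0 + snr.sum(axis=-1))

def outage_prob(powers, n_trials=100_000):
    # Monte Carlo estimate of Pr(D > D_th) over Rayleigh block fading:
    # squared channel magnitudes are unit-mean exponential variates.
    gains = rng.exponential(scale=1.0, size=(n_trials, N))
    return (distortion(gains, powers) > D_th).mean()

# Equal power allocation as a no-CSI baseline (total budget P_tot = 6).
p_eq = np.full(N, 1.0)
print(f"outage, equal powers: {outage_prob(p_eq):.4f}")
```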
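
The abstract's last sentence mentions an SPSA-based optimizer. Below is a minimal sketch of that idea, continuing the code above (it reuses `outage_prob`, `rng`, and `N`). The gain sequences follow Spall's standard recommendations, and the softmax mapping that enforces a total-power budget is an assumed constraint-handling device, not necessarily the paper's formulation.

```python
def powers_from(theta, p_total=6.0):
    # Map unconstrained parameters to a positive allocation that meets a
    # total-power budget (softmax-style; an assumption for illustration).
    w = np.exp(theta - theta.max())
    return p_total * w / w.sum()

def spsa_minimize(f, theta0, iters=200, a=0.2, c=0.1, alpha=0.602, gamma=0.101):
    # Minimal SPSA loop. f is a noisy objective (here, a Monte Carlo outage
    # estimate); each iteration needs only two evaluations of f, regardless
    # of the dimension of theta.
    theta = np.asarray(theta0, dtype=float).copy()
    for k in range(iters):
        ak = a / (k + 1) ** alpha
        ck = c / (k + 1) ** gamma
        delta = rng.choice([-1.0, 1.0], size=theta.shape)  # Rademacher signs
        # Since delta is +/-1, dividing by delta equals multiplying by it.
        g_hat = (f(theta + ck * delta) - f(theta - ck * delta)) / (2 * ck) * delta
        theta -= ak * g_hat
    return theta

# Optimize the power allocation against the noisy outage estimate.
theta_opt = spsa_minimize(lambda t: outage_prob(powers_from(t), n_trials=20_000),
                          np.zeros(N))
print(f"outage, SPSA powers:  {outage_prob(powers_from(theta_opt)):.4f}")
```

In this symmetric toy model the equal allocation is already near-optimal, so the sketch mainly illustrates the mechanics: SPSA gets by with two noisy objective evaluations per iteration, which is what makes it attractive when the outage probability has no closed form, at the computational cost the abstract notes.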