Error scaling laws for linear optimal estimation from relative measurements

  • Authors:
  • Prabir Barooah; João P. Hespanha

  • Affiliations:
  • Department of Mechanical and Aerospace Engineering, University of Florida, Gainesville, FL; Department of Electrical and Computer Engineering, University of California, Santa Barbara, CA

  • Venue:
  • IEEE Transactions on Information Theory
  • Year:
  • 2009

Abstract

In this paper, we study the problem of estimating vector-valued variables from noisy "relative" measurements, a problem that arises in sensor network applications. The problem can be posed in terms of a graph whose nodes correspond to variables and whose edges correspond to noisy measurements of the difference between two variables. We consider the optimal (minimum variance) linear unbiased estimate of the node variables, with an arbitrary variable chosen as the reference. This paper investigates how the variance of the estimation error of a node variable grows with the distance of that node from the reference node. A classification of graphs, namely, dense or sparse in R^d, 1 ≤ d ≤ 3, is established that determines this growth rate. In particular, if a graph is dense in 1-D, 2-D, or 3-D, then a node variable's estimation error is upper bounded by a linear, logarithmic, or bounded function of its distance from the reference, respectively. Corresponding lower bounds hold if the graph is sparse in 1-D, 2-D, or 3-D. These results show that naive measures of graph density, such as node degree, are inadequate predictors of the estimation error. Because they hold for the optimal linear unbiased estimate, these scaling laws determine algorithm-independent limits on the estimation accuracy achievable in large graphs.
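
To make the setup concrete, the following is a minimal numerical sketch, not taken from the paper, of the optimal linear unbiased estimate for scalar node variables on a 1-D path graph with i.i.d. measurement noise. The graph, noise model, and all names below are illustrative assumptions chosen only to mirror the problem statement in the abstract.

import numpy as np

# Assumed setup: scalar node variables, a path graph, i.i.d. noise of variance sigma2.
# Edge e = (u, v) provides a noisy relative measurement zeta_e = x_u - x_v + noise_e.

rng = np.random.default_rng(0)

n = 20                                       # number of nodes; node 0 is the reference
edges = [(i, i + 1) for i in range(n - 1)]   # a 1-D path graph
x_true = rng.normal(size=n)
x_true -= x_true[0]                          # the reference variable is known (taken as 0)

# Incidence matrix A: one column per edge, +1 at the first endpoint, -1 at the second.
A = np.zeros((n, len(edges)))
for k, (u, v) in enumerate(edges):
    A[u, k] = 1.0
    A[v, k] = -1.0

sigma2 = 0.1                                 # noise variance, assumed equal on all edges
zeta = A.T @ x_true + rng.normal(scale=np.sqrt(sigma2), size=len(edges))

# Ground the reference node by dropping its row, leaving the reduced incidence matrix.
A_r = A[1:, :]

# Minimum-variance linear unbiased estimate (weighted least squares with x_0 fixed):
#   x_hat = (A_r W A_r^T)^{-1} A_r W zeta,  with W = (1/sigma2) I
L_r = (A_r @ A_r.T) / sigma2                 # reduced weighted Laplacian
x_hat = np.linalg.solve(L_r, A_r @ zeta / sigma2)

# The estimation-error covariance is L_r^{-1}; its diagonal gives per-node error variance.
err_var = np.diag(np.linalg.inv(L_r))
print("error variance vs. hop distance from the reference:", np.round(err_var, 3))

In this sketch the per-node error variance equals the effective resistance from that node to the reference when each edge is given conductance 1/sigma2, so on the path graph it grows linearly with hop distance, consistent with the 1-D scaling behavior described in the abstract.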