Asymptotic analysis of error probabilities for the nonzero-mean Gaussian hypothesis testing problem

  • Authors:
  • R. K. Bahr

  • Affiliations:
  • Dept. of Electr. & Comput. Eng., Arizona Univ., Tucson, AZ

  • Venue:
  • IEEE Transactions on Information Theory
  • Year:
  • 1990

Abstract

Using a large-deviation theory approach, the rate at which the probability of detection error vanishes as the sample size increases is studied for the testing of nonzero-mean Gaussian stochastic processes. After a suitable transformation, the likelihood ratio test statistic is expressed as a sum of independent Gaussian random variables. The precise asymptotic rate at which the tail probability of this sum vanishes is derived using Ellis' theorem in conjunction with the asymptotic analysis of Toeplitz matrices. As a specific example, a signal composed of a deterministic mean component, a zero-mean stochastic component, and a white-noise background is tested against white noise alone. The results confirm the obvious: for fixed stochastic signal power, the rate of error decrease grows as the power in the deterministic mean increases; at higher signal-to-noise ratios, the probability of error must vanish more quickly. For a fixed deterministic mean component, a curious dip in the rate of error decrease appears as the stochastic signal power increases; as this power is increased further, however, the rate of error decrease eventually increases again.
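
To make the quantity being studied concrete, the following is a minimal numerical sketch, not the paper's derivation: it approximates the per-sample Chernoff error exponent for testing white noise alone against a signal with a deterministic mean, a zero-mean stochastic component, and additive white noise, by minimizing the Gaussian Chernoff divergence over the tilting parameter at a finite block length. The AR(1)-type autocovariance, the constant mean level, the parameter values, and helper names such as `error_exponent` are assumptions made for illustration; the paper itself obtains the exact asymptotic rate via Ellis' theorem together with Toeplitz-matrix asymptotics rather than finite-block numerics.

```python
# Illustrative sketch (assumed model, not the paper's method):
#   H0: x ~ N(0, sigma2 * I)            white noise alone
#   H1: x ~ N(mu, Sigma_s + sigma2 * I) deterministic mean + stochastic signal + noise
# The per-sample error exponent is approximated by the Gaussian Chernoff
# divergence at block length n, minimized over the tilting parameter s in (0, 1).
import numpy as np
from scipy.linalg import toeplitz
from scipy.optimize import minimize_scalar


def chernoff_divergence(s, m0, m1, S0, S1):
    """-log of the integral of p0^(1-s) * p1^s for two Gaussian densities."""
    Ss = (1 - s) * S0 + s * S1
    dm = m1 - m0
    quad = 0.5 * s * (1 - s) * dm @ np.linalg.solve(Ss, dm)
    _, logdet_s = np.linalg.slogdet(Ss)
    _, logdet_0 = np.linalg.slogdet(S0)
    _, logdet_1 = np.linalg.slogdet(S1)
    return quad + 0.5 * (logdet_s - (1 - s) * logdet_0 - s * logdet_1)


def error_exponent(n, mean_level, signal_power, rho=0.9, sigma2=1.0):
    """Per-sample Chernoff exponent for an n-sample block (illustrative parameters)."""
    m0 = np.zeros(n)
    m1 = np.full(n, mean_level)                             # deterministic mean component
    Sigma_s = signal_power * toeplitz(rho ** np.arange(n))  # assumed AR(1)-type stochastic signal
    S0 = sigma2 * np.eye(n)                                 # white noise alone
    S1 = Sigma_s + S0                                       # signal-plus-noise covariance
    res = minimize_scalar(
        lambda s: -chernoff_divergence(s, m0, m1, S0, S1) / n,
        bounds=(1e-3, 1 - 1e-3), method="bounded",
    )
    return -res.fun


if __name__ == "__main__":
    # Rate of error decrease vs. stochastic signal power for a fixed mean level.
    for p in (0.0, 0.5, 1.0, 2.0, 4.0):
        print(f"signal power {p:4.1f}: exponent ~ {error_exponent(128, 1.0, p):.4f}")
```

Sweeping the stochastic signal power with the mean level held fixed gives a crude numerical analogue of the behavior described at the end of the abstract; the finite-block value only approximates the true limiting exponent that the paper characterizes exactly.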