The problem of hypothesis testing against independence for a Gauss-Markov random field (GMRF) is analyzed. Assuming an acyclic dependency graph, an expression for the log-likelihood ratio of detection is derived. Assuming that nodes are placed randomly over a large region according to a Poisson or uniform distribution, with a nearest-neighbor dependency graph, the error exponent of the Neyman-Pearson detector is derived using large-deviations theory. The error exponent is expressed as a functional of the dependency graph, and its limit is evaluated through a special law of large numbers for stabilizing graph functionals. The exponent is then analyzed for different values of the variance ratio and correlation: a more correlated GMRF has a higher exponent at low values of the variance ratio, whereas the situation is reversed at high values of the variance ratio.
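The setting above can be illustrated with a minimal numerical sketch. This is not the paper's method: it assumes, for illustration only, a one-dimensional GMRF (a first-order Gauss-Markov chain, which is one instance of an acyclic dependency graph) with a hypothetical correlation parameter `a`, and it checks that the log-likelihood ratio for testing against independence separates the two hypotheses on average (its means under the two hypotheses are the two Kullback-Leibler divergences, up to sign).

```python
import numpy as np

rng = np.random.default_rng(0)

n = 64          # number of node observations (illustrative)
a = 0.5         # hypothetical correlation under H1
sigma2 = 1.0    # common per-node variance

# H0 (independence): i.i.d. N(0, sigma2) samples.
# H1: first-order Gauss-Markov chain with unit marginal variance,
#     an acyclic dependency graph with covariance a^{|i-j|}.
Sigma0 = sigma2 * np.eye(n)
idx = np.arange(n)
Sigma1 = sigma2 * a ** np.abs(idx[:, None] - idx[None, :])

P0 = np.linalg.inv(Sigma0)
P1 = np.linalg.inv(Sigma1)   # tridiagonal: the AR(1) covariance has a banded inverse
_, logdet0 = np.linalg.slogdet(Sigma0)
_, logdet1 = np.linalg.slogdet(Sigma1)

def llr(x):
    """Log-likelihood ratio log p1(x) - log p0(x) for zero-mean Gaussians."""
    return 0.5 * (x @ (P0 - P1) @ x) + 0.5 * (logdet0 - logdet1)

# Monte Carlo check: the LLR is negative on average under H0 and
# positive on average under H1, so thresholding it (Neyman-Pearson)
# discriminates the hypotheses.
L1c = np.linalg.cholesky(Sigma1)
m = 2000
llr_h0 = np.array([llr(rng.standard_normal(n)) for _ in range(m)])
llr_h1 = np.array([llr(L1c @ rng.standard_normal(n)) for _ in range(m)])
print(llr_h0.mean() < 0 < llr_h1.mean())  # → True
```

The rate at which the miss probability of the thresholded LLR decays with the number of nodes is the error exponent studied in the abstract; for random node placements it becomes a graph functional rather than the simple chain quantity sketched here.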