The general formulas for the Neyman-Pearson type-II error exponent subject to two different type-I error constraints, as indicated in the title of the correspondence, are established. As the formulas reveal, the type-II error exponents are fully determined by the asymptotic statistical behavior of the normalized log-likelihood ratio evaluated under the null-hypothesis distribution. Applications of the general formulas to distributed Neyman-Pearson detection and to the channel reliability function are also demonstrated.
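As a concrete special case (a sketch, not the correspondence's general formulas), consider i.i.d. observations. Under the null distribution P0, the normalized log-likelihood ratio (1/n) log[P0^n(x)/P1^n(x)] concentrates, by the law of large numbers, at the Kullback-Leibler divergence D(P0 || P1), and the classical Chernoff-Stein lemma identifies this value as the best type-II error exponent under a fixed type-I error constraint. The Bernoulli parameters below are illustrative assumptions:

```python
import math
import random

# Illustrative Bernoulli pair (assumed parameters, not from the correspondence):
# null hypothesis P0 = Bern(p0), alternative P1 = Bern(p1).
p0, p1 = 0.3, 0.6

def kl_bernoulli(p, q):
    """D(Bern(p) || Bern(q)) in nats."""
    return p * math.log(p / q) + (1 - p) * math.log((1 - p) / (1 - q))

def normalized_llr(n, rng):
    """(1/n) * sum_i log[P0(x_i)/P1(x_i)] with x_i drawn i.i.d. from P0."""
    total = 0.0
    for _ in range(n):
        x = 1 if rng.random() < p0 else 0
        total += math.log(p0 / p1) if x else math.log((1 - p0) / (1 - p1))
    return total / n

rng = random.Random(0)
d = kl_bernoulli(p0, p1)
est = normalized_llr(200_000, rng)
print(f"D(P0||P1) = {d:.4f} nats, empirical normalized LLR under P0 = {est:.4f}")
```

The printed values nearly coincide for large n, illustrating the abstract's point that the type-II exponent is governed by the limiting behavior of the normalized log-likelihood ratio under the null hypothesis.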