General Theory of Information Transfer and Combinatorics
An Interpretation of Identification Entropy
IEEE Transactions on Information Theory
We provide two new results on identification for sources. The first result concerns block codes. In [Ahlswede and Cai, IEEE-IT, 52(9), 4198-4207, 2006] it is proven that the q-ary identification entropy $H_{I,q}(P)$ is a lower bound on the expected number $L(P,P)$ of checks during the identification process. A necessary assumption in that proof is that the uniform distribution minimizes the symmetric running time $L_{\mathcal C}(P, P)$ for binary block codes $\mathcal C=\{0,1\}^k$. This assumption is proved in Sect. 2 not only for binary block codes but for all q-ary block codes. The second result concerns upper bounds on the worst-case running time. In [Ahlswede, Balkenhol and Kleinewächter, LNCS, 4123, 51-61, 2006] the authors proved in Theorem 3 that L(P)
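For concreteness, a minimal sketch of the quantity being bounded: the q-ary identification entropy of Ahlswede and Cai is $H_{I,q}(P) = \frac{q}{q-1}\bigl(1 - \sum_u P(u)^2\bigr)$. The function name below is illustrative, not from the paper.

```python
def identification_entropy(p, q):
    """q-ary identification entropy H_{I,q}(P) = q/(q-1) * (1 - sum_u P(u)^2).

    p -- probability distribution over source outputs (a list summing to 1)
    q -- alphabet size of the code (q >= 2)
    """
    assert abs(sum(p) - 1.0) < 1e-9, "p must be a probability distribution"
    return q / (q - 1) * (1.0 - sum(x * x for x in p))

# Uniform distribution over 4 source outputs, binary codes (q = 2):
print(identification_entropy([0.25] * 4, 2))  # -> 1.5
```

For the uniform distribution over $N$ outputs this evaluates to $\frac{q}{q-1}\bigl(1 - \frac{1}{N}\bigr)$, the case relevant to the block-code assumption discussed above.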