IEEE Transactions on Signal Processing
Convergence properties of Shannon entropy are studied. In the differential setting, it is known that weak convergence of probability measures (convergence in distribution) is not sufficient for convergence of the associated differential entropies. In that direction, an interesting example is introduced and discussed in light of new general results provided here for the desired differential entropy convergence, which cover densities both with and without compact support. Convergence of differential entropy is also characterized in terms of the Kullback-Leibler divergence for densities with fairly general supports, and it is shown that convergence in variation of probability measures guarantees such convergence under an appropriate boundedness condition on the densities involved. Results for the discrete setting are also provided, allowing for infinitely supported probability measures, by taking advantage of the equivalence between weak convergence and convergence in variation in that setting.
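As a numerical sketch of why weak convergence alone does not force entropy convergence, consider the classic textbook-style family f_n(x) = 1 + cos(2πnx) on [0, 1] (not necessarily the example discussed in the paper). These densities converge weakly to the uniform density as n → ∞, yet by periodicity every f_n has the same differential entropy, ln 2 − 1 ≈ −0.307, whereas the uniform limit has differential entropy 0. The helper name `differential_entropy` below is illustrative, not from the paper.

```python
import numpy as np
from scipy.integrate import quad
from scipy.special import xlogy  # xlogy(0, 0) == 0, so zeros of the density are safe

def differential_entropy(n, pieces=64):
    """h(f_n) = -∫_0^1 f_n log f_n  for  f_n(x) = 1 + cos(2*pi*n*x).

    The unit interval is split into subintervals so that quad resolves
    the oscillations even for larger n. (Illustrative helper only.)
    """
    f = lambda x: 1.0 + np.cos(2.0 * np.pi * n * x)
    integrand = lambda x: -xlogy(f(x), f(x))
    edges = np.linspace(0.0, 1.0, pieces + 1)
    return sum(quad(integrand, a, b)[0] for a, b in zip(edges[:-1], edges[1:]))

# Each f_n has entropy ln 2 - 1, independent of n, while the weak limit
# (uniform on [0, 1]) has differential entropy 0: the entropies do not
# converge to the entropy of the limit.
for n in (1, 2, 8):
    print(n, differential_entropy(n))  # ≈ -0.30685 for every n
```

Finer and finer oscillations are exactly the kind of behavior that weak convergence ignores but differential entropy sees, which is why the paper's results impose additional conditions (e.g., convergence in variation plus boundedness) to recover entropy convergence.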