In discrete systems, Shannon entropy is well known to be characterized by the Shannon-Khinchin axioms. Recently, this set of axioms was generalized for Tsallis entropy, a one-parameter generalization of Shannon entropy. In continuous systems, Shannon differential entropy has been introduced as a natural extension of the above Shannon entropy, without using an axiomatic approach. We derive the generalized entropy function as a solution of the functional equation determined by the generalized Shannon additivity, one of the most important axioms among the generalized Shannon-Khinchin axioms for Tsallis entropy. This generalized entropy function naturally introduces Tsallis differential entropy and two Tsallis divergences. In particular, one of the divergences (of Csiszár type) has almost the same form as the α-divergence in information geometry, while the other is of Bregman type. Our results reveal that the generalized Shannon additivity, which represents the branch structure of a rooted tree, plays an essential role in the determination of these entropies.
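As a minimal illustration (not taken from the paper), the Tsallis entropy S_q(p) = (1 − Σ_i p_i^q)/(q − 1) recovers the Shannon entropy −Σ_i p_i ln p_i in the limit q → 1, and for independent systems it satisfies the well-known pseudo-additivity S_q(A, B) = S_q(A) + S_q(B) + (1 − q) S_q(A) S_q(B); note that this pseudo-additivity is a standard Tsallis property and is not the same statement as the generalized Shannon additivity axiom discussed above. The sketch below assumes probability vectors given as plain Python lists:

```python
import math

def tsallis_entropy(p, q):
    """Tsallis entropy S_q(p) = (1 - sum_i p_i^q) / (q - 1).

    For q == 1 this reduces to the Shannon entropy -sum_i p_i ln p_i
    (natural logarithm), which we compute directly to avoid a 0/0 form.
    """
    if q == 1.0:
        return -sum(pi * math.log(pi) for pi in p if pi > 0)
    return (1.0 - sum(pi ** q for pi in p)) / (q - 1.0)

# Shannon limit: uniform distribution on 4 outcomes gives ln 4.
p_uniform = [0.25] * 4
print(tsallis_entropy(p_uniform, 1.0))  # ln 4 ~ 1.3863
print(tsallis_entropy(p_uniform, 2.0))  # (1 - 4 * 0.0625) / 1 = 0.75

# Pseudo-additivity for an independent joint distribution:
# S_q(A, B) = S_q(A) + S_q(B) + (1 - q) * S_q(A) * S_q(B).
q = 2.0
p_a = [0.5, 0.5]
p_b = [0.25] * 4
p_joint = [a * b for a in p_a for b in p_b]
lhs = tsallis_entropy(p_joint, q)
rhs = (tsallis_entropy(p_a, q) + tsallis_entropy(p_b, q)
       + (1.0 - q) * tsallis_entropy(p_a, q) * tsallis_entropy(p_b, q))
print(abs(lhs - rhs) < 1e-12)  # True
```

The q → 1 limit can also be checked numerically: tsallis_entropy(p, 1 + eps) approaches tsallis_entropy(p, 1.0) as eps shrinks.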