Tsallis differential entropy and divergences derived from the generalized Shannon-Khinchin axioms
ISIT'09 Proceedings of the 2009 IEEE international conference on Symposium on Information Theory - Volume 1
To theoretically explain the ubiquity of power-law behavior in nature, such as chaos and fractals, Tsallis entropy has been successfully applied to generalize the traditional Boltzmann-Gibbs statistics, whose fundamental information measure is Shannon entropy. Tsallis entropy S_q is a one-parameter generalization of Shannon entropy S_1 in the sense that lim_{q→1} S_q = S_1; the generalized statistics built on Tsallis entropy are referred to as Tsallis statistics. To formulate the law of error in Tsallis statistics as a generalization of Gauss' law of error and prove it mathematically, we apply the new multiplication operation (the q-product) determined by the q-logarithm and the q-exponential, the fundamental functions of Tsallis statistics, to the definition of the likelihood function in Gauss' law of error. The resulting maximum-likelihood principle (MLP) determines the so-called q-Gaussian distribution, which coincides with one of the Tsallis distributions derived from the maximum entropy principle for Tsallis entropy under the second-moment constraint.
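The objects named in the abstract can be illustrated numerically. The sketch below uses the standard textbook definitions of the q-logarithm, q-exponential, q-product, Tsallis entropy, and the (unnormalized) q-Gaussian kernel from the Tsallis-statistics literature; the function names and check values are ours, not taken from the paper:

```python
import math

def q_log(x, q):
    """q-logarithm: ln_q(x) = (x**(1-q) - 1) / (1 - q); reduces to ln(x) as q -> 1."""
    if q == 1.0:
        return math.log(x)
    return (x ** (1.0 - q) - 1.0) / (1.0 - q)

def q_exp(x, q):
    """q-exponential (inverse of ln_q): exp_q(x) = [1 + (1-q)*x]_+ ** (1/(1-q))."""
    if q == 1.0:
        return math.exp(x)
    base = 1.0 + (1.0 - q) * x
    return base ** (1.0 / (1.0 - q)) if base > 0.0 else 0.0

def q_product(x, y, q):
    """q-product: x (x)_q y = exp_q(ln_q(x) + ln_q(y)); reduces to x*y as q -> 1.
    This is the multiplication used to define the likelihood in the generalized MLP."""
    return q_exp(q_log(x, q) + q_log(y, q), q)

def tsallis_entropy(p, q):
    """Tsallis entropy S_q = (1 - sum_i p_i**q) / (q - 1);
    reduces to Shannon entropy S_1 = -sum_i p_i ln p_i as q -> 1."""
    if q == 1.0:
        return -sum(pi * math.log(pi) for pi in p if pi > 0.0)
    return (1.0 - sum(pi ** q for pi in p)) / (q - 1.0)

def q_gaussian_kernel(x, beta, q):
    """Unnormalized q-Gaussian, exp_q(-beta * x**2); q -> 1 recovers the Gaussian kernel."""
    return q_exp(-beta * x * x, q)
```

As a quick sanity check, `q_product(2.0, 3.0, q)` approaches `6.0` and `tsallis_entropy([0.5, 0.5], q)` approaches `ln 2` as `q -> 1`, matching the limit lim_{q→1} S_q = S_1 stated in the abstract.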