By generalizing the basic functional equation f(xy) = f(x) + f(y) to the form f^β(xy) = f^β(x) + f^β(y), β ≤ 1, one can derive a family of solutions which are exactly the inverse of the Mittag-Leffler function, referred to as the Mittag-Leffler logarithm, or logarithm of fractional order. This result provides a new family of generalized informational entropies, indexed by a parameter clearly related to fractals via fractional calculus, and quite relevant in the presence of defects of observation. The relation to Shannon's entropy, Rényi's entropy and Tsallis' entropy is clarified, and it is shown that Tsallis' generalized logarithm has a significance in terms of fractional calculus. The case β = 2 appears directly relevant to the amplitude of probability in quantum mechanics, and provides an approach to defining an "amplitude of informational entropy". We examine the kind of results that can be obtained by applying the maximum entropy principle in this setting. In the presence of uncertain (or fuzzy) definition, the Mittag-Leffler function would be more relevant than the Gaussian normal law. To some extent, this new formulation can be fully supported by the derivation of a new family of fractional Fisher information.
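The generalized functional equation above can be checked numerically on its elementary closed-form branch: setting g = f^β reduces f^β(xy) = f^β(x) + f^β(y) to g(xy) = g(x) + g(y), whose solution is the ordinary logarithm (up to a constant), giving f(x) = (ln x)^{1/β}. This is only a minimal sketch of that elementary branch, not the paper's full inverse-Mittag-Leffler solution; the function name `f_beta` is a hypothetical illustration.

```python
import math

def f_beta(x, beta):
    # Elementary branch of the solutions of f(xy)**beta == f(x)**beta + f(y)**beta:
    # with g = f**beta, the equation forces g = ln (up to a constant), so
    # f(x) = (ln x)**(1/beta). Valid here for x >= 1 so the root is real.
    # (Sketch only; the paper's general solution is the Mittag-Leffler logarithm.)
    return math.log(x) ** (1.0 / beta)

x, y = 3.0, 7.0
for beta in (0.5, 2.0):
    lhs = f_beta(x * y, beta) ** beta
    rhs = f_beta(x, beta) ** beta + f_beta(y, beta) ** beta
    assert abs(lhs - rhs) < 1e-12  # functional equation holds on this branch

# beta = 1 recovers the ordinary logarithm of the classical equation.
assert abs(f_beta(math.e, 1.0) - 1.0) < 1e-12
```

Note that for non-integer 1/β the expression (ln x)^{1/β} is real only for x ≥ 1, which is why the check is run on arguments greater than one.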