We consider the maximum entropy problems associated with the Rényi Q-entropy, subject to two kinds of constraints on expected values: a constraint on the standard expectation, and a constraint on the generalized expectation encountered in nonextensive statistics. The optimum maximum entropy probability distributions, which can exhibit power-law behaviour, are derived and characterized. The Rényi entropy of each optimum distribution can be viewed as a function of the constraint value; this defines two families of entropy functionals on the space of possible expected values. General properties of these functionals, including nonnegativity, the location of their minimum, and convexity, are documented. Their relationships, as well as numerical aspects, are also discussed. Finally, we work out some specific cases for the reference measure Q(x) and recover, in a limit case, some well-known entropies.
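As an illustration of the first kind of problem described above, the sketch below numerically maximizes the Rényi entropy of a discrete distribution subject to a standard expectation constraint. This is not the paper's derivation: the support `x`, the entropy order `alpha`, and the target mean `m` are arbitrary assumptions chosen for demonstration, and a general-purpose constrained optimizer stands in for the analytic characterization.

```python
import numpy as np
from scipy.optimize import minimize

def renyi_entropy(p, alpha):
    """Rényi entropy H_alpha(p) = log(sum_i p_i^alpha) / (1 - alpha), alpha != 1."""
    p = np.clip(p, 1e-12, None)  # guard against log/power issues at p_i = 0
    return np.log(np.sum(p ** alpha)) / (1.0 - alpha)

# Assumed setup (for illustration only): finite support 1..20,
# entropy order alpha = 0.7, target standard expectation m = 5.
x = np.arange(1, 21, dtype=float)
alpha, m = 0.7, 5.0
n = len(x)

constraints = [
    {"type": "eq", "fun": lambda p: np.sum(p) - 1.0},   # normalization
    {"type": "eq", "fun": lambda p: np.dot(p, x) - m},  # mean constraint
]

# Maximize entropy = minimize its negative, starting from the uniform law.
res = minimize(
    lambda p: -renyi_entropy(p, alpha),
    x0=np.full(n, 1.0 / n),
    bounds=[(0.0, 1.0)] * n,
    constraints=constraints,
    method="SLSQP",
)
p_opt = res.x
# For alpha < 1 the maximizer tends to decay more slowly in the tail than
# the exponential profile of the Shannon (alpha -> 1) maximum entropy law.
```

The optimum returned by the solver satisfies both constraints to numerical tolerance; sweeping `m` and recording the resulting optimum entropy traces out one of the entropy functionals over expected values that the abstract describes.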