Analytical formulas for the entropy and the mutual information of multivariate continuous probability distributions are presented.
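As a concrete illustration of such closed-form expressions (a standard textbook case, not a formula taken from this particular paper), the differential entropy of a multivariate Gaussian is H = ½ ln((2πe)^d det Σ), and the mutual information between the two components of a bivariate Gaussian with correlation ρ is I = −½ ln(1 − ρ²). A minimal NumPy sketch:

```python
import numpy as np

def gaussian_entropy(cov):
    """Differential entropy (in nats) of N(mu, cov):
    H = 0.5 * ln((2*pi*e)^d * det(cov))."""
    cov = np.atleast_2d(np.asarray(cov, dtype=float))
    d = cov.shape[0]
    _, logdet = np.linalg.slogdet(cov)  # numerically stable log-determinant
    return 0.5 * (d * np.log(2 * np.pi * np.e) + logdet)

def gaussian_mutual_information(rho):
    """Mutual information (in nats) between the components of a
    bivariate Gaussian with correlation rho: I = -0.5 * ln(1 - rho^2)."""
    return -0.5 * np.log(1.0 - rho ** 2)

# Consistency check: I(X; Y) = H(X) + H(Y) - H(X, Y)
rho = 0.8
cov = np.array([[1.0, rho],
                [rho, 1.0]])
h_marginal = gaussian_entropy([[1.0]])
h_joint = gaussian_entropy(cov)
assert np.isclose(2 * h_marginal - h_joint, gaussian_mutual_information(rho))
```

The identity I(X; Y) = H(X) + H(Y) − H(X, Y) ties the two closed forms together: the stronger the correlation, the smaller the joint entropy relative to the sum of the marginals, and the larger the mutual information.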