Elements of information theory
On maximum entropy characterization of Pearson's type II and VII multivariate distributions. Journal of Multivariate Analysis.
Expressions for Rényi and Shannon entropies for bivariate distributions. Information Sciences—Informatics and Computer Science: An International Journal.
Multivariate dynamic information. Journal of Multivariate Analysis.
Entropy expressions for multivariate continuous distributions. IEEE Transactions on Information Theory.
Information importance of predictors: Concept, measures, Bayesian inference, and applications. Computational Statistics & Data Analysis.
Models based on partial information about survival and hazard gradient. Probability in the Engineering and Informational Sciences.
This paper shows that multivariate distributions can be characterized as maximum entropy (ME) models based on the well-known general representation of the density function of an ME distribution subject to moment constraints. In this approach, the problem of ME characterization reduces to the problem of representing the multivariate density in the ME form, so there is no need for case-by-case proofs by calculus of variations or other methods. The main vehicle for this ME characterization approach is the information distinguishability relationship, which extends to the multivariate case. Results are also formulated that encapsulate the implications of the multiplication rule of probability and of the entropy transformation formula for ME characterization. The dependence structure of the multivariate ME distribution is studied in terms of the moments and the support of the distribution. The relationships of ME distributions with the exponential family and with bivariate distributions having exponential-family conditionals are explored. Applications include new ME characterizations of many bivariate distributions, including some singular distributions.
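As a concrete illustration of the ME principle invoked above (a minimal numerical sketch, not taken from the paper): among all distributions with a fixed mean and variance, the normal distribution attains the maximum differential entropy. The snippet below compares the closed-form entropies of a normal and a Laplace distribution matched to the same variance; the function names and the choice of the Laplace competitor are illustrative assumptions.

```python
import math

def normal_entropy(var):
    """Differential entropy of N(mu, var): 0.5 * ln(2*pi*e*var)."""
    return 0.5 * math.log(2 * math.pi * math.e * var)

def laplace_entropy(b):
    """Differential entropy of Laplace(mu, b): 1 + ln(2b); its variance is 2*b**2."""
    return 1.0 + math.log(2 * b)

var = 1.0
b = math.sqrt(var / 2.0)        # choose b so the Laplace variance 2*b**2 equals var

h_norm = normal_entropy(var)    # entropy of the ME (normal) distribution
h_lap = laplace_entropy(b)      # entropy of a competitor with the same variance

# The normal attains the larger entropy, consistent with its ME characterization
# under first- and second-moment constraints.
assert h_norm > h_lap
```

The same logic underlies the paper's approach: once a density is written in the ME exponential form for a given set of moment constraints, it is automatically the entropy maximizer in that constraint class, with no variational argument needed.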