This paper develops information measures for multivariate distributions whose supports are truncated progressively. The focus is on the joint, marginal, and conditional entropies and on the mutual information of residual life distributions, where the support is truncated at the current ages of the components of a system. The current ages of the components induce a joint dynamic in the residual life information measures. Our study of dynamic information measures includes several important bivariate and multivariate lifetime models. We derive explicit entropy expressions for a few models, including the Marshall-Olkin bivariate exponential. In general, however, studying the dynamics of residual information measures requires either numerical computation or analytical results. A bivariate gamma example illustrates the study of dynamic information via numerical integration. We also present analytical results that facilitate the study of other distributions: results on the monotonicity of the residual entropy of a system, on transformations that preserve this monotonicity and the entropy ordering between two systems, and a new entropy characterization of the joint distribution of independent exponential random variables.
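To make the notion of a dynamic (age-dependent) entropy concrete, the following sketch computes the residual entropy H(t) = -∫ₜ^∞ (f(x)/S(t)) log(f(x)/S(t)) dx of a univariate lifetime by trapezoidal numerical integration, in the spirit of the numerical approach the abstract mentions for the bivariate gamma. This is an illustrative example, not code from the paper; the function names, the exponential test case, and the integration scheme are all our own choices. For an exponential lifetime the memoryless property makes H(t) constant in t, equal to 1 − log(rate), which gives a simple sanity check.

```python
import math

def residual_entropy(pdf, sf, t, upper, n=100_000):
    """Entropy of the residual life (X - t | X > t) by trapezoidal integration.

    pdf   : density f of the lifetime X
    sf    : survival function S(x) = P(X > x)
    t     : current age (truncation point of the support)
    upper : practical upper limit replacing infinity (density must be
            negligible beyond it)
    """
    st = sf(t)
    h = (upper - t) / n
    total = 0.0
    for i in range(n + 1):
        x = t + i * h
        g = pdf(x) / st          # density of the residual life at x
        val = -g * math.log(g) if g > 0 else 0.0
        w = 0.5 if i in (0, n) else 1.0   # trapezoidal end-point weights
        total += w * val
    return total * h

# Exponential(rate) lifetime: residual entropy is constant in t,
# H(t) = 1 - log(rate), by memorylessness.
rate = 2.0
pdf = lambda x: rate * math.exp(-rate * x)
sf = lambda x: math.exp(-rate * x)
print(residual_entropy(pdf, sf, t=1.0, upper=20.0))  # close to 1 - log 2
```

The same scheme extends to the bivariate setting by truncating each coordinate at the corresponding component's age and integrating the joint residual density over a rectangle, which is how a model like the bivariate gamma can be explored numerically when closed-form entropy expressions are unavailable.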