The importance of predictors is characterized by the extent to which their use reduces uncertainty about predicting the response variable, namely their information importance. The uncertainty associated with a probability distribution is a concave function of the density whose global maximum is attained at the uniform distribution, reflecting the most difficult prediction situation. Shannon entropy is used to operationalize the concept. For nonstochastic predictors, maximum entropy characterization of probability distributions provides measures of information importance. For stochastic predictors, the expected entropy difference gives measures of information importance, which are invariant under one-to-one transformations of the variables. Applications to various data types and models lead to familiar statistical quantities, yet with the unified interpretation of uncertainty reduction. Bayesian inference procedures for the importance and relative importance of predictors are developed. Three examples show applications to normal regression, contingency table, and logit analyses.
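To make the stochastic-predictor case concrete, the expected entropy difference can be read as H(Y) minus the expected conditional entropy E_X[H(Y|X)], i.e. the mutual information between predictor and response. The sketch below, which is illustrative rather than taken from the article, computes this uncertainty reduction for a hypothetical joint probability table; the specific probabilities are assumed for the example only.

    import numpy as np

    # Hypothetical joint distribution p(x, y): predictor X indexes rows,
    # response Y indexes columns. Values are assumed for illustration only.
    p_xy = np.array([[0.20, 0.10, 0.05],
                     [0.05, 0.25, 0.35]])

    def entropy(p):
        """Shannon entropy (in nats) of a probability vector, ignoring zeros."""
        p = p[p > 0]
        return -np.sum(p * np.log(p))

    p_x = p_xy.sum(axis=1)   # marginal distribution of the predictor
    p_y = p_xy.sum(axis=0)   # marginal distribution of the response

    # Expected entropy difference: H(Y) - sum_x p(x) H(Y | X = x),
    # the uncertainty reduction attributed to X (equals the mutual information I(X; Y)).
    h_y = entropy(p_y)
    h_y_given_x = sum(p_x[i] * entropy(p_xy[i] / p_x[i]) for i in range(len(p_x)))
    importance = h_y - h_y_given_x
    print(f"H(Y) = {h_y:.4f}, E[H(Y|X)] = {h_y_given_x:.4f}, importance = {importance:.4f}")

Under this reading, the relative importance of several predictors could be compared by normalizing each expected entropy difference by H(Y), so that each predictor's contribution is expressed as a fraction of the total uncertainty about the response.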