Information importance of predictors: Concept, measures, Bayesian inference, and applications

  • Authors:
  • J. J. Retzer; E. S. Soofi; R. Soyer

  • Affiliations:
  • Maritz Research, 1815 S. Meyers Road, Suite 600, Oakbrook Terrace, IL 60181, USA; Sheldon B. Lubar School of Business, University of Wisconsin-Milwaukee, P.O. Box 742, Milwaukee, WI 53201, USA and Center for Research on International Economics, University of Wisconsin-Milwaukee ...; Department of Decision Sciences, George Washington University, Washington, DC 20052, USA

  • Venue:
  • Computational Statistics & Data Analysis
  • Year:
  • 2009

Abstract

The importance of predictors is characterized by the extent to which their use reduces uncertainty about predicting the response variable, namely their information importance. The uncertainty associated with a probability distribution is a concave function of the density whose global maximum is attained at the uniform distribution, reflecting the most difficult prediction situation. Shannon entropy is used to operationalize the concept. For nonstochastic predictors, maximum entropy characterization of probability distributions provides measures of information importance. For stochastic predictors, the expected entropy difference gives measures of information importance, which are invariant under one-to-one transformations of the variables. Applications to various data types lead to familiar statistical quantities for the corresponding models, all with the unified interpretation of uncertainty reduction. Bayesian inference procedures for the importance and relative importance of predictors are developed. Three examples illustrate applications to normal regression, contingency table, and logit analyses.
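
As a minimal sketch of the idea (not code from the paper): under a normal linear regression model, the expected entropy reduction from using the predictors equals the mutual information I(Y; X) = -(1/2) log(1 - R²), a standard identity. The snippet below, with illustrative function names and simulated data, measures a predictor's importance as the drop in this quantity when that predictor is omitted, mirroring the expected-entropy-difference interpretation.

```python
import numpy as np

def information_importance_normal(y, X):
    """Entropy reduction (in nats) from predicting y with predictors X,
    assuming a normal linear regression model: I(Y; X) = -0.5 * log(1 - R^2)."""
    X1 = np.column_stack([np.ones(len(y)), X])     # design matrix with intercept
    beta, *_ = np.linalg.lstsq(X1, y, rcond=None)  # least-squares fit
    resid = y - X1 @ beta
    r2 = 1.0 - resid.var() / y.var()               # coefficient of determination
    return -0.5 * np.log(1.0 - r2)

# Simulated example: y depends on the first predictor only.
rng = np.random.default_rng(0)
x = rng.normal(size=(500, 2))
y = 1.0 + 2.0 * x[:, 0] + rng.normal(size=500)

full = information_importance_normal(y, x)           # both predictors
without_1 = information_importance_normal(y, x[:, [1]])  # predictor 1 omitted
print(f"information importance of predictor 1: {full - without_1:.3f} nats")
```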