Bayesian bin distribution inference and mutual information

  • Authors:
  • D. Endres; P. Foldiak

  • Affiliations:
  • Sch. of Psychol., Univ. of St. Andrews, UK

  • Venue:
  • IEEE Transactions on Information Theory
  • Year:
  • 2005

Abstract

We present an exact Bayesian treatment of a simple, yet sufficiently general probability distribution model. We consider piecewise-constant distributions P(X) with a uniform (second-order) prior over the locations of the discontinuity points and the assigned chances. The predictive distribution and the model complexity can be determined completely from the data in computational time that is linear in the number of degrees of freedom and quadratic in the number of possible values of X. Furthermore, exact values of the expectations of entropies and their variances can be computed with polynomial effort. The expectation of the mutual information thus becomes available as well, together with a strict upper bound on its variance. The resulting algorithm is particularly useful in experimental research areas where the number of available samples is severely limited (e.g., neurophysiology). On a simulated data set, the resulting estimates are more accurate than those obtained with a previously proposed method.
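
The abstract does not spell out the algorithm, but the model class it describes (piecewise-constant distributions with a uniform prior over discontinuity points and bin probabilities) can be illustrated with a brute-force Bayesian model average. The sketch below is an assumption-laden illustration, not the paper's linear/quadratic-time method: it enumerates every boundary placement explicitly, assumes a flat Dirichlet prior over the bin masses and a uniform prior over the cut configurations, and stops at the predictive distribution rather than the entropy and mutual-information expectations the paper computes exactly. The function names and the toy `counts` histogram are hypothetical.

```python
import numpy as np
from itertools import combinations
from math import lgamma, log, comb

def log_evidence(counts, boundaries):
    """Log marginal likelihood of the observed counts under one
    piecewise-constant model: values between consecutive boundaries share
    a single bin mass, with a flat Dirichlet prior over the bin masses."""
    edges = [0, *boundaries, len(counts)]
    seg_counts = [sum(counts[a:b]) for a, b in zip(edges[:-1], edges[1:])]
    seg_sizes = [b - a for a, b in zip(edges[:-1], edges[1:])]
    nb, n = len(seg_counts), sum(seg_counts)
    # Dirichlet(1,...,1) integral, plus the uniform spread inside each bin
    log_ev = lgamma(nb) - lgamma(n + nb)
    for c, s in zip(seg_counts, seg_sizes):
        log_ev += lgamma(c + 1) - c * log(s)
    return log_ev

def predictive(counts):
    """Posterior-averaged predictive distribution over X, obtained by a
    brute-force sum over all boundary placements (exponential in the
    number of values of X; for illustration only)."""
    K = len(counts)
    log_w, preds = [], []
    for m in range(K):                                   # number of cut points
        for bnd in combinations(range(1, K), m):
            edges = [0, *bnd, K]
            seg_counts = [sum(counts[a:b]) for a, b in zip(edges[:-1], edges[1:])]
            seg_sizes = [b - a for a, b in zip(edges[:-1], edges[1:])]
            nb, n = len(seg_counts), sum(seg_counts)
            # posterior-mean bin mass, spread uniformly over the bin's values
            p = np.concatenate([np.full(s, (c + 1) / (n + nb) / s)
                                for c, s in zip(seg_counts, seg_sizes)])
            # uniform prior over placements given m (and, implicitly, over m)
            log_w.append(log_evidence(counts, bnd) - log(comb(K - 1, m)))
            preds.append(p)
    w = np.exp(np.array(log_w) - max(log_w))
    w /= w.sum()
    return w @ np.array(preds)

counts = [5, 4, 6, 0, 1, 0, 0, 7, 8]   # toy histogram over nine values of X
p_hat = predictive(counts)
print(p_hat, p_hat.sum())              # predictive probabilities, sums to 1
```

The same posterior weighting could in principle be applied to per-model entropy estimates to approximate the expected entropy or mutual information, but the exponential enumeration here is exactly what the paper's polynomial-time treatment avoids.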