Consider a posterior density π(λ, φ) such that both full conditionals π(λ|φ) and π(φ|λ) are known. We propose to approximate π(λ, φ) by π(λ|φ)π̂(φ), where π̂(φ) is a finite mixture of the posterior conditionals π(φ|λ). The weights and components of the mixture are chosen to minimize an approximate f-divergence between the approximate and the actual posterior. These approximate divergences are computed through an importance-sampling idea, using a simulated sample from the same finite mixture approximation. For the special case of the χ² or harmonic divergences, once the minimum approximate divergences have been obtained, they can be plugged into total-variation-type inequalities to obtain precision limits for the corresponding approximations of posterior expectations of interest. When the algorithm can be used, namely when both full conditionals π(λ|φ) and π(φ|λ) are known, it requires little computational, programming, and diagnostic effort. Moreover, we present several examples showing that the approximations produced are extremely accurate, even when only a small number of components is included in the mixture approximation.
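The scheme above can be sketched numerically on a toy problem. The sketch below is illustrative only, not the authors' algorithm: it assumes a bivariate-normal posterior with known correlation ρ (so both conditionals are Gaussian and the true marginal of φ is N(0, 1) and can be checked against), fixes hypothetical anchor values λ_j for the mixture components, estimates one directed χ² divergence between the mixture approximation and the target by importance sampling from the mixture itself, and minimizes it over the simplex of weights with a crude finite-difference gradient descent on softmax logits. The final line evaluates the total-variation-type bound TV ≤ ½√χ² mentioned in the abstract.

```python
import numpy as np

rng = np.random.default_rng(0)
rho = 0.8                 # correlation of the toy bivariate-normal posterior (assumption)
s2 = 1.0 - rho**2         # variance of the conditional pi(phi | lambda)
anchors = np.linspace(-2.0, 2.0, 5)   # hypothetical anchor values lambda_j

def norm_pdf(x, mu, var):
    return np.exp(-(x - mu)**2 / (2.0 * var)) / np.sqrt(2.0 * np.pi * var)

def mixture_pdf(phi, w):
    # pi-hat(phi) = sum_j w_j * pi(phi | lambda_j), each conditional Gaussian
    return sum(wj * norm_pdf(phi, rho * lj, s2) for wj, lj in zip(w, anchors))

def chi2_estimate(w, u, eps):
    # Importance-sampling estimate of the directed chi^2 divergence
    # chi^2(q || p) = E_{phi ~ q}[ q(phi)/p(phi) ] - 1, with q the mixture and
    # p the (here known) true marginal N(0, 1).  u, eps are shared random draws
    # so finite differences in w are smooth (common random numbers).
    comps = np.minimum(np.searchsorted(np.cumsum(w), u), len(anchors) - 1)
    phi = rho * anchors[comps] + np.sqrt(s2) * eps
    ratio = mixture_pdf(phi, w) / norm_pdf(phi, 0.0, 1.0)
    return np.mean(ratio) - 1.0

def softmax(z):
    e = np.exp(z - z.max())
    return e / e.sum()

# Crude minimization of the estimated divergence over the weight simplex:
# finite-difference gradient descent on softmax logits (illustrative choice).
z = np.zeros(len(anchors))
for _ in range(100):
    u, eps = rng.random(4000), rng.standard_normal(4000)
    base = chi2_estimate(softmax(z), u, eps)
    g = np.empty_like(z)
    for k in range(len(z)):
        zk = z.copy()
        zk[k] += 1e-2
        g[k] = (chi2_estimate(softmax(zk), u, eps) - base) / 1e-2
    z -= 0.5 * g

w = softmax(z)
chi2 = chi2_estimate(w, rng.random(20000), rng.standard_normal(20000))
print("weights:", np.round(w, 3))
print("estimated chi^2 divergence:", chi2)
print("total-variation bound 0.5*sqrt(chi^2):", 0.5 * np.sqrt(max(chi2, 0.0)))
```

In the paper's actual setting the target marginal is unknown and the divergence is only approximated; here the true marginal is used so the quality of the five-component fit can be read off directly from the estimated divergence and the resulting precision bound.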