Deriving the Continuity of Maximum-Entropy Basis Functions via Variational Analysis

  • Authors:
  • N. Sukumar; R. J.-B. Wets

  • Venue:
  • SIAM Journal on Optimization
  • Year:
  • 2007

Abstract

In this paper, we prove the continuity of maximum-entropy basis functions using variational analysis techniques. The use of information-theoretic variational principles to derive basis functions is a recent development. In this setting, data approximation is viewed as an inductive inference problem: the basis functions play the role of a discrete probability distribution, and the polynomial reproducing conditions act as the linear constraints. For a set of distinct nodes $\{x^i\}_{i=1}^n$ in ${\mathbb R}^d$, the convex approximation of a function $u(x)$ is $u^h(x) = \sum_{i=1}^n p_i(x) u_i$, where $\{ p_i \}_{i=1}^n$ are nonnegative basis functions and $u^h(x)$ must reproduce affine functions: $\sum_{i=1}^n p_i(x) = 1$ and $\sum_{i=1}^n p_i(x)\, x^i = x$. Subject to these constraints, we compute $p_i(x)$ by minimizing the relative entropy functional (Kullback-Leibler distance) $D(p\|m) = \sum_{i=1}^n p_i(x) \ln \bigl(p_i(x)/m_i(x)\bigr)$, where $m_i(x)$ is a known prior weight-function distribution. To prove the continuity of the basis functions, we appeal to the theory of epiconvergence.
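
The abstract specifies the optimization problem but not a solution procedure. A standard route in the max-ent approximation literature is to pass to the convex dual: the minimizer has the form $p_i(x) = m_i(x)\, e^{-\lambda\cdot(x^i - x)}/Z(\lambda)$ with $Z(\lambda) = \sum_j m_j(x)\, e^{-\lambda\cdot(x^j - x)}$, and the multipliers $\lambda \in {\mathbb R}^d$ minimize $\ln Z(\lambda)$. The sketch below is a minimal Newton iteration on that dual, under these standard assumptions; the function name `maxent_basis` and all parameter names are illustrative, not from the paper.

```python
import numpy as np

def maxent_basis(x, nodes, prior, tol=1e-12, max_iter=50):
    """Evaluate max-ent basis functions p_i(x) at the point x.

    Solves min_p D(p||m) subject to sum_i p_i = 1, sum_i p_i x^i = x
    via the dual: minimize ln Z(lam) over lam in R^d, where
    Z(lam) = sum_i m_i(x) exp(-lam . (x^i - x)).
    """
    dx = nodes - x                     # shifted nodes x^i - x, shape (n, d)
    lam = np.zeros(x.size)             # Lagrange multipliers for x-reproduction
    for _ in range(max_iter):
        w = prior * np.exp(-dx @ lam)  # m_i(x) exp(-lam . (x^i - x))
        p = w / w.sum()                # normalization enforces sum_i p_i = 1
        g = -dx.T @ p                  # gradient of ln Z; zero iff sum_i p_i x^i = x
        if np.linalg.norm(g) < tol:
            break
        mean = dx.T @ p
        H = (dx * p[:, None]).T @ dx - np.outer(mean, mean)  # Hessian of ln Z
        lam -= np.linalg.solve(H, g)   # Newton step on the convex dual
    return p

# Example: corners of the unit square with a uniform (Shannon) prior.
nodes = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0], [1.0, 1.0]])
prior = np.ones(len(nodes))
x = np.array([0.3, 0.6])
p = maxent_basis(x, nodes, prior)
assert np.isclose(p.sum(), 1.0)       # partition of unity
assert np.allclose(p @ nodes, x)      # linear (affine) reproduction
```

The Hessian here is the covariance of the shifted nodes under $p$, so it is positive definite whenever the evaluation point lies in the interior of the nodes' convex hull and the nodes affinely span ${\mathbb R}^d$, which keeps the Newton iteration well posed in that regime.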