On the convexity of higher order Jensen differences based on entropy functions (Corresp.)

  • Authors:
  • J. Burbea; C. Rao

  • Venue:
  • IEEE Transactions on Information Theory
  • Year:
  • 2006

Abstract

In an earlier work, the authors introduced a divergence measure, called the first-order Jensen difference, or in short the 𝒥-divergence, which is based on entropy functions of degree α. This provided a generalization of the measure of mutual information based on Shannon's entropy (corresponding to α = 1). It was shown that the first-order 𝒥-divergence is a convex function only when α is restricted to a certain range. We define higher-order Jensen differences and show that they are convex functions only when the underlying entropy function is of degree two. A statistical application requiring the convexity of higher-order Jensen differences is indicated.
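The first-order Jensen difference described above can be sketched numerically: it is the entropy of a mixture of distributions minus the mixed entropies, computed under an entropy function of degree α. The sketch below is a minimal illustration under assumed conventions (the Havrda-Charvát-type normalization for the degree-α entropy and equal mixture weights are assumptions, not necessarily the paper's exact definitions); at α = 1 it reduces to the Shannon case mentioned in the abstract.

```python
import math

def entropy_deg(p, alpha):
    # Entropy of degree alpha; reduces to Shannon entropy at alpha = 1.
    # The normalization (2^(1-alpha) - 1)^-1 is one common convention
    # (Havrda-Charvat type) and is an assumption here.
    if alpha == 1:
        return -sum(x * math.log(x) for x in p if x > 0)
    return (sum(x ** alpha for x in p) - 1) / (2 ** (1 - alpha) - 1)

def jensen_difference(p, q, alpha=1, w=(0.5, 0.5)):
    # First-order Jensen difference: entropy of the weighted mixture
    # minus the weighted mixture of the entropies.
    mix = [w[0] * a + w[1] * b for a, b in zip(p, q)]
    return entropy_deg(mix, alpha) - (
        w[0] * entropy_deg(p, alpha) + w[1] * entropy_deg(q, alpha)
    )

p = [0.9, 0.1]
q = [0.1, 0.9]
# Nonnegative by concavity of the entropy; zero when p == q.
print(jensen_difference(p, q, alpha=1))
print(jensen_difference(p, q, alpha=2))
```

Concavity of the entropy function makes the Jensen difference nonnegative for fixed distributions; the paper's question is the stronger one of when the difference is convex in its arguments.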