Complete-data spaces and generalized EM algorithms

  • Authors:
  • J. A. Fessler; A. O. Hero

  • Affiliations:
  • University of Michigan, Ann Arbor, MI, USA; University of Michigan, Ann Arbor, MI, USA

  • Venue:
  • Proceedings of the 1993 IEEE International Conference on Acoustics, Speech, and Signal Processing (ICASSP '93), Volume 4
  • Year:
  • 1993


Abstract

Expectation-maximization (EM) algorithms have been applied extensively for computing maximum-likelihood and penalized-likelihood parameter estimates in signal processing applications. Intrinsic to each EM algorithm is a complete-data space (CDS), a hypothetical set of random variables that is related to the parameters more naturally than the measurements are. The authors describe two generalizations of the EM paradigm: (i) allowing the relationship between the CDS and the measured data to be nondeterministic, and (ii) using a sequence of alternating complete-data spaces. These generalizations are motivated in part by the influence of the CDS on the convergence rate, a relationship that is formalized through a data-processing inequality for Fisher information. These concepts are applied to the problem of estimating superimposed signals in Gaussian noise, and it is shown that the new space-alternating generalized EM algorithm converges significantly faster than the ordinary EM algorithm.
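To make the superimposed-signals setting concrete, the following is a minimal sketch (not the authors' code) of the classical EM iteration for a linear special case: the observation is a sum of known unit-energy waveforms with unknown amplitudes plus Gaussian noise, and the complete-data space assigns each component its waveform plus a fixed fraction of the noise. All variable names and the choice of two orthogonal sinusoids are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200
t = np.arange(n)

# Two known, roughly orthogonal, unit-energy signal shapes (an assumed example);
# the unknown parameters are their amplitudes.
s = np.stack([np.cos(2 * np.pi * 5 * t / n), np.cos(2 * np.pi * 12 * t / n)])
s /= np.linalg.norm(s, axis=1, keepdims=True)

a_true = np.array([3.0, -2.0])
y = a_true @ s + 0.1 * rng.standard_normal(n)  # superimposed signals in noise

# Complete-data space: component k "owns" a beta[k] share of the noise.
beta = np.array([0.5, 0.5])

a = np.zeros(2)  # initial amplitude estimates
for _ in range(50):
    resid = y - a @ s
    # E-step: conditional mean of each hypothetical complete datum --
    # its own signal plus its share of the current residual.
    x_hat = a[:, None] * s + beta[:, None] * resid
    # M-step: per-component least squares (signals are unit-energy).
    a = np.sum(x_hat * s, axis=1)

print(a)  # estimates near a_true
```

For orthogonal signals this iteration contracts toward the projection coefficients at a rate governed by the noise fractions `beta`, which illustrates the abstract's point that the choice of CDS controls the convergence rate; a space-alternating scheme updates one component at a time against the full residual (effectively `beta[k] = 1` for the active component), shrinking the per-update CDS and speeding convergence.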