The expectation-maximization (EM) method can facilitate maximizing likelihood functions that arise in statistical estimation problems. In the classical EM paradigm, one iteratively maximizes the conditional log-likelihood of a single unobservable complete data space, rather than maximizing the intractable likelihood function for the measured or incomplete data. EM algorithms update all parameters simultaneously, which has two drawbacks: 1) slow convergence, and 2) difficult maximization steps, since smoothness penalties couple the parameters. The paper describes the space-alternating generalized EM (SAGE) method, which updates the parameters sequentially by alternating between several small hidden-data spaces defined by the algorithm designer. The authors prove that the sequence of estimates monotonically increases the penalized-likelihood objective, derive asymptotic convergence rates, and give sufficient conditions for monotone convergence in norm. Two signal processing applications illustrate the method: estimation of superimposed signals in Gaussian noise, and image reconstruction from Poisson measurements. In both applications, the SAGE algorithms easily accommodate smoothness penalties and converge faster than the EM algorithms.
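The sequential-update idea can be illustrated on a toy version of the superimposed-signals application. The sketch below is an assumption-laden simplification, not the paper's algorithm: the component signals are known up to unknown amplitudes, so the linear-Gaussian model makes each SAGE maximization step closed-form, and the method reduces to cycling through components, associating all of the noise (and the current residual) with the component being updated, then refreshing the residual immediately. All names (`sage_amplitudes`, `S`, `a_true`) are made up for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical toy problem: y = sum_k a_k * s_k + Gaussian noise,
# with known signals s_k (columns of S); estimate the amplitudes a_k.
n, K = 200, 3
S = rng.standard_normal((n, K))       # columns are the known signals s_k
a_true = np.array([2.0, -1.0, 0.5])
y = S @ a_true + 0.1 * rng.standard_normal(n)

def sage_amplitudes(y, S, iters=50):
    """SAGE-style sequential update: each amplitude a_k gets its own
    small hidden-data space (all the noise is associated with component
    k), and the residual is updated right after each parameter update,
    rather than updating all parameters simultaneously as in EM."""
    K = S.shape[1]
    a = np.zeros(K)
    r = y - S @ a                      # residual under current estimates
    for _ in range(iters):
        for k in range(K):
            # E-step for component k only: hidden data = its signal + residual
            x_k = S[:, k] * a[k] + r
            # Closed-form M-step: least-squares fit of a_k to x_k
            a_new = (S[:, k] @ x_k) / (S[:, k] @ S[:, k])
            # Propagate the change immediately so the next component
            # sees the updated residual (the source of faster convergence)
            r -= S[:, k] * (a_new - a[k])
            a[k] = a_new
    return a

a_hat = sage_amplitudes(y, S)
print(a_hat)  # close to a_true
```

In this linear-Gaussian special case the sequential scheme coincides with coordinate descent on the least-squares objective, which makes the monotonicity of the objective easy to see; the paper's contribution is the general hidden-data-space framework that extends this behavior to nonlinear and Poisson models with penalties.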