Promoter Recognition for E. coli DNA Segments by Independent Component Analysis. In: CSB '04, Proceedings of the 2004 IEEE Computational Systems Bioinformatics Conference.
Integration of Stochastic Models by Minimizing α-Divergence. In: Neural Computation.
Information Geometry and Its Applications: Convex Function and Dually Flat Manifold. In: Emerging Trends in Visual Computing.
α-Divergence Is Unique, Belonging to Both f-Divergence and Bregman Divergence Classes. In: IEEE Transactions on Information Theory.
Independent Component Analysis Minimizing Convex Divergence. In: ICANN/ICONIP '03, Proceedings of the 2003 Joint International Conference on Artificial Neural Networks and Neural Information Processing.
A new likelihood maximization algorithm, the α-EM algorithm (α-expectation-maximization algorithm), is presented. For an appropriate range of the design parameter α, it converges faster than the traditional logarithmic EM algorithm, which is recovered as the special case α = −1. The main idea behind the α-EM algorithm is to search for an effective surrogate function, a minorizer, for maximizing the observed data's likelihood ratio. The surrogate adopted in this paper is based on the α-logarithm, which is related to the convex divergence. The convergence speed of the α-EM algorithm is analyzed theoretically through α-dependent update matrices and illustrated by numerical simulations. Finally, general guidelines for using the α-logarithmic methods are given, and the choice of alternative surrogate functions is discussed.
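The α-logarithm underlying the surrogate can be sketched numerically. The sketch below assumes the definition used in the α-EM literature, L^(α)(x) = (2/(1+α))(x^((1+α)/2) − 1) for α ≠ −1, with L^(−1)(x) = log x as the limiting case; the function name `alpha_log` and the numerical check are ours, not from the paper:

```python
import math

def alpha_log(x, alpha):
    """alpha-logarithm L^(alpha)(x) as used in alpha-EM-style surrogates.

    For alpha = -1 this is the ordinary natural logarithm; other values
    of the design parameter alpha give a one-parameter deformation of log.
    (Helper name is our own; definition assumed from the alpha-EM literature.)
    """
    if alpha == -1:
        return math.log(x)
    return (2.0 / (1.0 + alpha)) * (x ** ((1.0 + alpha) / 2.0) - 1.0)

# As alpha approaches -1, the alpha-logarithm recovers log x, consistent
# with the log-EM algorithm being the special case alpha = -1:
for a in (-0.999, -1.0):
    print(a, alpha_log(2.0, a))
```

Varying α away from −1 changes the curvature of the surrogate, which is what gives the α-EM iteration a different (and, for suitable α, faster) convergence behavior than log-EM.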