A constrained EM algorithm for univariate normal mixtures
Journal of Statistical Computation and Simulation
Practical methods of optimization; (2nd ed.)
Modern Applied Statistics with S
Minimum disparity computation via the iteratively reweighted least integrated squares algorithms
Computational Statistics & Data Analysis
Fisher scoring: An interpolation family and its Monte Carlo implementations
Computational Statistics & Data Analysis
The Fisher scoring and Gauss-Newton methods are two well-known methods for maximum likelihood computation. This paper generalizes both methods in a unified manner so that they can be applied to difficult maximum likelihood computations, for example when there are constraints on the parameters. A generalized method does not use the Newton-type iteration formulas of these methods directly; instead, it uses the corresponding quadratic functions derived from them. It proceeds by repeatedly approximating the log-likelihood function by a quadratic function in a neighborhood of the current iterate and optimizing each quadratic function within the parameter space. It is shown that each quadratic function has a weighted linear regression formulation, which can be solved conveniently. The generalization also extends the applicability of the Fisher scoring method to situations where the expected Fisher information matrix is unavailable in closed form. Fast computation can generally be expected, owing to the methods' small (linear) convergence rates and the rapid solution of each linear regression problem. While the generalized Gauss-Newton method can suffer from the so-called large-residual problem, the generalized Fisher scoring method performed consistently well in the numerical experiments we conducted.
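The weighted-linear-regression formulation of a Fisher scoring step can be illustrated in the familiar special case of a generalized linear model, where Fisher scoring coincides with iteratively reweighted least squares. The sketch below is not the paper's generalized algorithm; it is a minimal Poisson-regression example (log link, function name and data are invented for illustration) showing how each scoring iteration reduces to a weighted least-squares solve:

```python
import numpy as np

def fisher_scoring_poisson(X, y, n_iter=25, tol=1e-10):
    """Fit a Poisson regression (log link) by Fisher scoring.

    Each iteration solves a weighted least-squares problem on a
    "working response" z: minimize sum_i W_i * (z_i - x_i' beta)^2.
    For GLMs this is exactly the regression formulation of a
    Fisher scoring step (a.k.a. IRLS).
    """
    beta = np.zeros(X.shape[1])
    for _ in range(n_iter):
        eta = X @ beta             # linear predictor
        mu = np.exp(eta)           # fitted means under the log link
        W = mu                     # Fisher weights: Var(y_i) = mu_i
        z = eta + (y - mu) / mu    # working response
        WX = X * W[:, None]
        beta_new = np.linalg.solve(X.T @ WX, X.T @ (W * z))
        if np.max(np.abs(beta_new - beta)) < tol:
            return beta_new
        beta = beta_new
    return beta
```

Because each step is an ordinary weighted least-squares problem, standard linear-algebra routines solve it quickly, which is the source of the "rapid solution of each linear regression problem" noted in the abstract.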