Analyticity, convergence, and convergence rate of recursive maximum-likelihood estimation in hidden Markov models

  • Authors: Vladislav B. Tadić
  • Affiliations: Department of Mathematics, University of Bristol, Bristol, UK
  • Venue: IEEE Transactions on Information Theory
  • Year: 2010

Abstract

This paper considers the asymptotic properties of the recursive maximum-likelihood estimator for hidden Markov models. The paper focuses on the analytic properties of the asymptotic log-likelihood and on the point convergence and convergence rate of the recursive maximum-likelihood estimator. Using the principle of analytic continuation, the analyticity of the asymptotic log-likelihood is shown for analytically parameterized hidden Markov models. Relying on this fact and on some results from differential geometry (the Lojasiewicz inequality), the almost sure point convergence of the recursive maximum-likelihood algorithm is demonstrated, and relatively tight bounds on the convergence rate are derived. In contrast to existing results on the asymptotic behavior of maximum-likelihood estimation in hidden Markov models, the results of this paper are obtained without assuming that the log-likelihood function has an isolated maximum at which the Hessian is strictly negative definite.
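
For orientation, the following is a hedged sketch, in generic notation rather than the paper's own, of the two ingredients named in the abstract: a recursive maximum-likelihood estimator of the stochastic-gradient type driven by the incremental (one-step predictive) log-likelihood, and the Lojasiewicz inequality for real-analytic functions, which stands in for the usual isolated-maximum/negative-definite-Hessian assumption.

  % Hedged sketch; notation (\ell, \gamma_n, \mu, M, \delta) is illustrative, not taken from the paper.
  % Generic recursive maximum-likelihood update with step sizes \gamma_{n+1}:
  \[
    \theta_{n+1} \;=\; \theta_n \;+\; \gamma_{n+1}\, \nabla_\theta \log p_\theta\!\left(Y_{n+1} \mid Y_{1:n}\right)\Big|_{\theta = \theta_n}.
  \]
  % Lojasiewicz gradient inequality near a point \theta^* where \ell is real-analytic:
  \[
    \lvert \ell(\theta) - \ell(\theta^*) \rvert^{\,1-\mu} \;\le\; M\, \lVert \nabla \ell(\theta) \rVert
    \qquad \text{whenever } \lVert \theta - \theta^* \rVert \le \delta,
  \]
  % for some local constants \mu \in (0, 1/2], M > 0, \delta > 0.

Here \(\ell\) stands for the asymptotic log-likelihood; the exponent \(\mu\) and the constants \(M, \delta\) are local quantities whose values are not specified by the abstract.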