Minimum $K_{\phi}$-divergence estimators for multinomial models and applications

  • Authors:
  • M. D. Jiménez-Gamero; R. Pino-Mejías; A. Rufián-Lizana

  • Affiliations:
  • Dpto. de Estadística e Investigación Operativa, Universidad de Sevilla, Seville, Spain (all authors)

  • Venue:
  • Computational Statistics
  • Year:
  • 2014


Abstract

The properties of minimum $K_{\phi}$-divergence estimators for parametric multinomial populations are well known when the assumed parametric model is true, namely, they are consistent and asymptotically normally distributed. Here we study these properties when the parametric model is not assumed to be correctly specified. Under certain conditions, these estimators are shown to converge to a well-defined limit and, suitably normalized, they are also asymptotically normal. Two applications of the results obtained are reported. First, two consistent bootstrap estimators of the null distribution of the test statistics in a certain class of goodness-of-fit tests are proposed and studied. Second, two methods for the model selection test problem based on $K_{\phi}$-divergence type statistics are proposed and studied. Both applications are illustrated with numerical examples.
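To illustrate the general idea of minimum divergence estimation in a parametric multinomial model, the following is a minimal sketch, not the paper's method: it uses the Kullback-Leibler divergence as a stand-in for a generic divergence between observed relative frequencies and model cell probabilities (the $K_{\phi}$-divergence family studied in the paper is not reproduced here), and a hypothetical one-parameter Hardy-Weinberg-type model; the names `model_probs` and `minimum_divergence_estimate` are illustrative assumptions.

```python
# Sketch only: minimum divergence estimation for a parametric multinomial
# model, using the Kullback-Leibler divergence as an illustrative stand-in
# for the divergence measure (not the paper's K_phi-divergence).
import numpy as np
from scipy.optimize import minimize_scalar

def kl_divergence(p_hat, p_theta, eps=1e-12):
    """Kullback-Leibler divergence between observed and model cell probabilities."""
    p_hat = np.clip(p_hat, eps, None)
    p_theta = np.clip(p_theta, eps, None)
    return np.sum(p_hat * np.log(p_hat / p_theta))

def model_probs(theta):
    """Hypothetical one-parameter, three-cell multinomial model:
    p(theta) = (theta^2, 2*theta*(1-theta), (1-theta)^2), theta in (0, 1)."""
    return np.array([theta**2, 2.0 * theta * (1.0 - theta), (1.0 - theta)**2])

def minimum_divergence_estimate(counts):
    """Minimize the divergence over theta for the observed cell counts."""
    p_hat = counts / counts.sum()
    res = minimize_scalar(lambda t: kl_divergence(p_hat, model_probs(t)),
                          bounds=(1e-6, 1.0 - 1e-6), method="bounded")
    return res.x

# Example: data simulated from the model with theta = 0.3
rng = np.random.default_rng(0)
counts = rng.multinomial(500, model_probs(0.3))
print(minimum_divergence_estimate(counts))
```

Swapping in a different divergence only requires replacing `kl_divergence`; the bootstrap applications described in the abstract would resample counts from the fitted model and recompute the test statistic on each resample.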