The properties of minimum $$K_{\phi }$$-divergence estimators for parametric multinomial populations are well known when the assumed parametric model is true, namely, they are consistent and asymptotically normally distributed. Here we study these properties when the parametric model is not assumed to be correctly specified. Under certain conditions, these estimators are shown to converge to a well-defined limit and, suitably normalized, they are also asymptotically normal. Two applications of the results obtained are reported. First, two consistent bootstrap estimators of the null distribution of the test statistics in a certain class of goodness-of-fit tests are proposed and studied. Second, two methods for the model selection test problem based on $$K_{\phi }$$-divergence type statistics are proposed and studied. Both applications are illustrated with numerical examples.
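To make the estimation idea concrete, the following is a minimal sketch of a minimum-divergence estimator for a parametric multinomial population. It is not the paper's method: as a stand-in for the $$K_{\phi }$$-divergence it uses one simple member of the $$\phi$$-divergence family (the Kullback–Leibler choice $$\phi(t)=t\log t-t+1$$), a hypothetical one-parameter trinomial model of Hardy–Weinberg form $$p(\theta)=(\theta^2,\,2\theta(1-\theta),\,(1-\theta)^2)$$, and a plain grid search in place of a proper optimizer.

```python
import math

def phi(t):
    # Convex generator of the divergence; here the Kullback-Leibler
    # choice phi(t) = t*log(t) - t + 1 (an assumption, one member of
    # the phi-divergence family, phi(1) = 0).
    return t * math.log(t) - t + 1.0 if t > 0 else 1.0

def divergence(p_hat, p_theta):
    # Phi-divergence between the empirical cell proportions p_hat and
    # the model cell probabilities p_theta.
    return sum(q * phi(p / q) for p, q in zip(p_hat, p_theta) if q > 0)

def model_probs(theta):
    # Hypothetical one-parameter trinomial model (Hardy-Weinberg form).
    return (theta ** 2, 2 * theta * (1 - theta), (1 - theta) ** 2)

def minimum_divergence_estimate(counts, grid_size=10000):
    # Minimum-divergence estimator: the theta whose model probabilities
    # are closest (in the chosen divergence) to the observed proportions.
    n = sum(counts)
    p_hat = [c / n for c in counts]
    best_theta, best_div = None, float("inf")
    for i in range(1, grid_size):          # crude grid search over (0, 1)
        theta = i / grid_size
        d = divergence(p_hat, model_probs(theta))
        if d < best_div:
            best_theta, best_div = theta, d
    return best_theta

# Cell counts consistent with the model at theta = 0.3 (n = 1000):
counts = (90, 420, 490)
theta_hat = minimum_divergence_estimate(counts)
```

With the Kullback–Leibler choice of $$\phi$$, minimizing the divergence over $$\theta$$ reproduces the multinomial maximum-likelihood estimate; other choices of $$\phi$$ trade this equivalence for robustness, and under misspecification the estimator converges to the parameter value whose model probabilities are divergence-closest to the true cell probabilities, which is the limit studied in the abstract.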