Minimum-entropy estimation in semi-parametric models

  • Authors:
  • Eric Wolsztynski; Eric Thierry; Luc Pronzato

  • Affiliations:
  • Laboratoire I3S, Université de Nice-Sophia Antipolis, Sophia Antipolis, France (all authors)

  • Venue:
  • Signal Processing - Special issue: Information theoretic signal processing
  • Year:
  • 2005


Abstract

In regression problems where the density f of the errors is not known, maximum likelihood is inapplicable, and the use of alternative techniques such as least squares or robust M-estimation generally implies inefficient estimation of the parameters. The search for adaptive estimators, that is, estimators that remain asymptotically efficient independently of the knowledge of f, has received a lot of attention; see in particular (Proceedings of the Third Berkeley Symposium on Mathematical Statistics and Probability, vol. 1, 1956, pp. 187; Ann. Stat. 3(2) (1975) 267; Ann. Stat. 10 (1982) 647) and the review paper (Econometric Rev. 3(2) (1984) 145). This paper considers a minimum-entropy parametric estimator that minimizes an estimate of the entropy of the distribution of the residuals. A first construction connects the method with the Stone-Bickel approach, in which the estimation is decomposed into two steps. We then consider a direct approach that does not involve any preliminary √n-consistent estimator. Results are presented that illustrate the good performance of minimum-entropy estimation for reasonable sample sizes compared with standard methods, in particular its robustness in the presence of outliers.
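
To make the idea concrete, the following is a minimal sketch of the general approach described in the abstract: fit a regression model by minimizing a plug-in estimate of the entropy of the residuals. The linear model, the leave-one-out Gaussian kernel density estimate, the Silverman-type bandwidth rule, and the use of a Nelder-Mead optimizer are illustrative assumptions, not the paper's exact construction; function names such as residual_entropy and min_entropy_fit are hypothetical.

import numpy as np
from scipy.optimize import minimize

def residual_entropy(theta, X, y, bandwidth=None):
    """Kernel plug-in estimate of the entropy of the residuals y - X @ theta."""
    e = y - X @ theta
    n = len(e)
    if bandwidth is None:
        # Silverman-type reference bandwidth (illustrative choice)
        bandwidth = 1.06 * e.std() * n ** (-1 / 5)
    # leave-one-out Gaussian kernel density evaluated at each residual
    diff = (e[:, None] - e[None, :]) / bandwidth
    K = np.exp(-0.5 * diff ** 2) / np.sqrt(2 * np.pi)
    np.fill_diagonal(K, 0.0)
    dens = K.sum(axis=1) / ((n - 1) * bandwidth)
    # plug-in entropy estimate: -(1/n) sum log f_hat(e_i)
    return -np.mean(np.log(dens + 1e-300))

def min_entropy_fit(X, y, theta0):
    """Minimize the estimated residual entropy over the regression parameters."""
    res = minimize(residual_entropy, theta0, args=(X, y), method="Nelder-Mead")
    return res.x

# Toy usage: heavy-tailed (Student-t) errors, where least squares is inefficient.
# The least-squares fit plays the role of a preliminary estimator, as in the
# two-step construction; the direct approach would optimize from any start.
rng = np.random.default_rng(0)
X = np.column_stack([np.ones(200), rng.normal(size=200)])
theta_true = np.array([1.0, 2.0])
y = X @ theta_true + rng.standard_t(df=2, size=200)
theta0 = np.linalg.lstsq(X, y, rcond=None)[0]
print(min_entropy_fit(X, y, theta0))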