Information geometry of U-Boost and Bregman divergence
Neural Computation
Robust Gaussian graphical modeling
Journal of Multivariate Analysis
Divergence based online learning in vector quantization
ICAISC'10 Proceedings of the 10th International Conference on Artificial Intelligence and Soft Computing: Part I
Divergence-based vector quantization
Neural Computation
The mathematics of divergence based online learning in vector quantization
ANNPR'10 Proceedings of the 4th IAPR TC3 conference on Artificial Neural Networks in Pattern Recognition
Density estimation with minimization of U-divergence
Machine Learning
Robust active learning for linear regression via density power divergence
ICONIP'12 Proceedings of the 19th International Conference on Neural Information Processing - Volume Part III
Active learning for noisy oracle via density power divergence
Neural Networks
Spontaneous clustering via minimum gamma-divergence
Neural Computation
In this paper we consider robust parameter estimation based on a certain cross entropy and divergence. The robust estimate is defined as the minimizer of the empirically estimated cross entropy. It is shown that the robust estimate can be regarded as a kind of projection, via a Pythagorean relation based on the divergence. This property implies that the bias caused by outliers remains sufficiently small even under heavy contamination. The asymptotic variance of the robust estimator is naturally inflated in proportion to the contamination ratio. One might surmise that some other form of cross entropy exhibits the same behavior. It is proved, under some conditions, that no cross entropy exhibits this behavior except the cross entropy considered here and its monotone transformations.
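As a rough illustration of the idea in the abstract (not the paper's exact estimator), the sketch below minimizes an empirical gamma-type cross entropy for a Gaussian location model with unit variance. Under that assumption the minimization reduces to maximizing a sum of exponentially decaying weights, whose stationary condition is a weighted-mean fixed-point iteration; observations far from the bulk receive near-zero weight, which is why the bias under heavy contamination stays small. The choice gamma = 0.5, the unit-variance model, and the function name are all assumptions made for this sketch.

```python
import numpy as np

def robust_mean_gamma(x, gamma=0.5, n_iter=100, tol=1e-8):
    """Robust location estimate for a unit-variance Gaussian model (sketch).

    For fixed variance, minimizing the empirical gamma-cross-entropy over
    the mean mu is equivalent to maximizing sum_i exp(-gamma*(x_i-mu)^2/2).
    Setting the derivative to zero gives a weighted mean with Gaussian
    weights, solved here by fixed-point iteration.
    """
    mu = np.median(x)  # robust starting point
    for _ in range(n_iter):
        # Outliers far from mu get exponentially small weights.
        w = np.exp(-gamma * (x - mu) ** 2 / 2.0)
        mu_new = np.sum(w * x) / np.sum(w)
        if abs(mu_new - mu) < tol:
            break
        mu = mu_new
    return mu

# Demo: 20% gross outliers pull the sample mean far off,
# while the divergence-based estimate stays near the bulk.
rng = np.random.default_rng(0)
data = np.concatenate([rng.normal(0.0, 1.0, 80), np.full(20, 10.0)])
naive = data.mean()               # biased toward the outliers
robust = robust_mean_gamma(data)  # close to the inlier center
```

The fixed point can be read as a projection in the sense described above: the estimate is the point of the model family closest to the contaminated empirical distribution under the divergence, and the exponential downweighting is what keeps the outlier-induced bias small.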