The Kullback-Leibler divergence and the Neyman-Pearson lemma are two fundamental concepts in statistics, and both concern likelihood ratios: the Kullback-Leibler divergence is the expected log-likelihood ratio, while the Neyman-Pearson lemma characterizes the error rates of likelihood ratio tests. Exploring this connection yields another statistical interpretation of the Kullback-Leibler divergence, in terms of the loss of power of the likelihood ratio test when the wrong distribution is used for one of the hypotheses. In this interpretation, the standard non-negativity property of the Kullback-Leibler divergence is essentially a restatement of the optimality of likelihood ratio tests established by the Neyman-Pearson lemma. Finally, the asymmetry of the Kullback-Leibler divergence is reviewed from the viewpoint of information geometry.
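To make the connection concrete, here is a minimal sketch using standard definitions (the densities g and f and the hypotheses below are generic symbols, not notation taken from the paper). The Kullback-Leibler divergence of f from g is the expected log-likelihood ratio under g,

\[
D(g \,\|\, f) \;=\; \mathbb{E}_g\!\left[\log\frac{g(X)}{f(X)}\right] \;=\; \int g(x)\,\log\frac{g(x)}{f(x)}\,dx \;\ge\; 0,
\]

with equality if and only if \(g = f\) almost everywhere, and it is asymmetric in general, \(D(g \,\|\, f) \neq D(f \,\|\, g)\). The Neyman-Pearson lemma states that, for testing \(H_0\colon X \sim f\) against \(H_1\colon X \sim g\) at a fixed significance level, the most powerful test rejects \(H_0\) when the likelihood ratio \(g(x)/f(x)\) exceeds a threshold. In the interpretation described above, replacing one of the densities in the ratio by a misspecified one gives a sub-optimal test, and the loss of power relative to the true likelihood ratio test is what the divergence quantifies.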