Attribute-Level Neighbor Hierarchy Construction Using Evolved Pattern-Based Knowledge Induction. IEEE Transactions on Knowledge and Data Engineering.
An introduction to symbolic data analysis and the SODAS software. Intelligent Data Analysis.
On Pinsker's and Vajda's type inequalities for Csiszár's f-divergences. IEEE Transactions on Information Theory.
Information, Divergence and Risk for Binary Experiments. The Journal of Machine Learning Research.
This note presents new inequalities relating different divergence measures in the family of “convex likelihood-ratio expectation” measures of Csiszár (1967) and Ali and Silvey (1966), and especially in the single-parameter family of “AM-GM” divergence measures. The most prominent result is that θ² ≤ ¼J, where θ is the Bhattacharyya angle of divergence (a true distance metric) and J is the symmetric cross-entropy. A pair of “log Γ” divergences is also introduced and related to the cross-entropies I and J.
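The stated bound θ² ≤ ¼J can be checked numerically. The sketch below is an illustration, not code from the note: it assumes the standard definitions θ = arccos Σᵢ√(pᵢqᵢ) for the Bhattacharyya angle and J = Σᵢ(pᵢ − qᵢ) log(pᵢ/qᵢ) for the symmetric (Jeffreys) cross-entropy, and tests the inequality on random discrete distributions.

```python
import numpy as np

def bhattacharyya_angle(p, q):
    # Bhattacharyya coefficient, clipped into [0, 1] for numerical safety
    # before taking arccos.
    bc = np.clip(np.sum(np.sqrt(p * q)), 0.0, 1.0)
    return np.arccos(bc)

def j_divergence(p, q):
    # Symmetric (Jeffreys) cross-entropy: sum of (p - q) * log(p / q).
    return np.sum((p - q) * np.log(p / q))

# Check theta^2 <= J/4 on random 5-point distributions drawn from a
# Dirichlet distribution (strictly positive entries, so log(p/q) is finite).
rng = np.random.default_rng(0)
holds = True
for _ in range(1000):
    p = rng.dirichlet(np.ones(5))
    q = rng.dirichlet(np.ones(5))
    holds = holds and bhattacharyya_angle(p, q) ** 2 <= j_divergence(p, q) / 4
print(bool(holds))
```

The two sides are asymptotically equal as p → q (both behave like a quarter of the chi-squared divergence locally), so the constant ¼ is the natural one to expect in such a bound.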