A new metric for probability distributions
IEEE Transactions on Information Theory
We prove that the variational distance (and its positive multiples) is the only f-divergence that satisfies both the identity of indiscernibles and the triangle inequality; it is therefore the unique f-divergence that is a metric. We interpret this result as a fundamental conflict between the convexity of f(x) and the metric properties of its associated f-divergence. We therefore relax the convexity of f(x) and replace it with other constraints in order to construct new metrics.
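As a quick numerical illustration (a minimal sketch, not code from the paper), the discrete f-divergence D_f(P||Q) = Σ_i q_i f(p_i/q_i) can be computed for two choices of f: the variational distance, f(x) = |x − 1|, which the abstract identifies as the unique metric among f-divergences, and the KL divergence, f(x) = x log x, which fails the metric axioms (it is not even symmetric). The distributions p, q, r below are arbitrary examples chosen for illustration.

```python
import math

def f_divergence(p, q, f):
    """Discrete f-divergence D_f(P||Q) = sum_i q_i * f(p_i / q_i)."""
    return sum(qi * f(pi / qi) for pi, qi in zip(p, q))

# Variational distance: f(x) = |x - 1|, so D_f(P||Q) = sum_i |p_i - q_i|
def variational(p, q):
    return f_divergence(p, q, lambda x: abs(x - 1))

# KL divergence: f(x) = x log x (convex, but not a metric)
def kl(p, q):
    return f_divergence(p, q, lambda x: x * math.log(x))

p, q, r = [0.5, 0.5], [0.9, 0.1], [0.7, 0.3]

# The variational distance is symmetric and obeys the triangle inequality.
assert variational(p, q) == variational(q, p)
assert variational(p, q) <= variational(p, r) + variational(r, q) + 1e-12

# KL is asymmetric, so it cannot serve as a metric.
print(kl(p, q), kl(q, p))  # two different values
```

On these distributions the variational distance evaluates to Σ|p_i − q_i| = 0.8, while KL(P||Q) and KL(Q||P) disagree, illustrating the asymmetry.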