Rényi Extrapolation of Shannon Entropy
Open Systems & Information Dynamics
Feature extraction by non-parametric mutual information maximization
The Journal of Machine Learning Research
Variance Reduction Techniques for Gradient Estimates in Reinforcement Learning
The Journal of Machine Learning Research
Streaming and sublinear approximation of entropy and information distances
SODA '06 Proceedings of the seventeenth annual ACM-SIAM symposium on Discrete Algorithms
Bounds for the Loss in Probability of Correct Classification Under Model Based Approximation
The Journal of Machine Learning Research
IEEE Transactions on Pattern Analysis and Machine Intelligence
Proceedings of the 25th international conference on Machine Learning
Learning decision trees with taxonomy of propositionalized attributes
Pattern Recognition
Taxonomy of nominal type histogram distance measures
MATH'08 Proceedings of the American Conference on Applied Mathematics
Kernel-Based Grouping of Histogram Data
ECML '07 Proceedings of the 18th European conference on Machine Learning
Action Recognition Using a Bio-Inspired Feedforward Spiking Network
International Journal of Computer Vision
Sublinear estimation of entropy and information distances
ACM Transactions on Algorithms (TALG)
Universal Estimation of Information Measures for Analog Sources
Foundations and Trends in Communications and Information Theory
Very accurate posterior approximations based on finite mixtures of the hyperparameters conditionals
Computational Statistics & Data Analysis
Identifying the underlying hierarchical structure of clusters in cluster analysis
ICANN'07 Proceedings of the 17th international conference on Artificial Neural Networks
On Pinsker's and Vajda's type inequalities for Csiszár's f-divergences
IEEE Transactions on Information Theory
Estimating divergence functionals and the likelihood ratio by convex risk minimization
IEEE Transactions on Information Theory
Information, Divergence and Risk for Binary Experiments
The Journal of Machine Learning Research
Theory of a probabilistic-dependence measure of dissimilarity among multiple clusters
ICANN'06 Proceedings of the 16th international conference on Artificial Neural Networks - Volume Part II
Classification of chromosome sequences with entropy kernel and LKPLS algorithm
ICIC'05 Proceedings of the 2005 international conference on Advances in Intelligent Computing - Volume Part I
Efficient Nearest-Neighbor Search in the Probability Simplex
Proceedings of the 2013 Conference on the Theory of Information Retrieval
Inequalities which connect information divergence with other measures of discrimination or distance between probability distributions are used in information theory and its applications to mathematical statistics, ergodic theory, and other scientific fields. We suggest new inequalities of this type, often based on underlying identities. As a consequence, we obtain certain improvements of the well-known Pinsker inequality. Our study depends on two measures of discrimination, called capacitory discrimination and triangular discrimination. The discussion contains references to related research and comparison with other measures of discrimination, e.g., Ali-Silvey-Csiszár (1966, 1967) divergences and, in particular, the Hellinger distance.
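For readers unfamiliar with the quantities named in the abstract, the standard definitions are given below (these are well-known formulations, not taken from the paper itself); here $P=(p_1,\dots,p_n)$ and $Q=(q_1,\dots,q_n)$ are probability distributions over the same finite alphabet:

```latex
% Information divergence (Kullback-Leibler) and total variation distance
D(P\|Q) = \sum_i p_i \ln\frac{p_i}{q_i},
\qquad
V(P,Q) = \sum_i \lvert p_i - q_i \rvert

% Pinsker's inequality, the bound the paper improves upon
D(P\|Q) \;\ge\; \tfrac{1}{2}\, V(P,Q)^2

% Triangular discrimination
\Delta(P,Q) = \sum_i \frac{(p_i - q_i)^2}{p_i + q_i}

% Capacitory discrimination, i.e. twice the Jensen-Shannon divergence
C(P,Q) = D\!\left(P \,\middle\|\, \tfrac{P+Q}{2}\right)
       + D\!\left(Q \,\middle\|\, \tfrac{P+Q}{2}\right)
```

Both $\Delta$ and $C$ are symmetric in $P$ and $Q$ and bounded, unlike $D(P\|Q)$ itself, which is what makes them useful intermediaries in inequalities of Pinsker type.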