Positive definite kernels on probability measures have recently been applied to structured data classification problems. Some of these kernels are related to classic information-theoretic quantities, such as mutual information and the Jensen-Shannon divergence. Meanwhile, driven by recent advances in Tsallis statistics, nonextensive generalizations of Shannon's information theory have been proposed. This paper bridges these two trends. We introduce the Jensen-Tsallis q-difference, a generalization of the Jensen-Shannon divergence. We then define a new family of nonextensive mutual information kernels, which allow weights to be assigned to their arguments, and which include the Boolean, Jensen-Shannon, and linear kernels as particular cases. We illustrate the performance of these kernels on text categorization tasks.
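As a rough illustration of the quantity named in the abstract, the sketch below computes a weighted Jensen-Tsallis q-difference from the Tsallis entropy. The exact definition assumed here (weights raised to the power q, reducing to the Jensen-Shannon divergence at q = 1) is an assumption for illustration, not a verbatim reproduction of the paper's formulation; function names are hypothetical.

```python
import numpy as np

def tsallis_entropy(p, q):
    # Tsallis entropy S_q(p) = (1 - sum_i p_i^q) / (q - 1).
    # In the limit q -> 1 it recovers the Shannon entropy
    # H(p) = -sum_i p_i log p_i, handled as a special case below.
    p = np.asarray(p, dtype=float)
    if abs(q - 1.0) < 1e-12:
        nz = p[p > 0]
        return -np.sum(nz * np.log(nz))
    return (1.0 - np.sum(p ** q)) / (q - 1.0)

def jensen_tsallis_q_difference(p1, p2, q, w=(0.5, 0.5)):
    # Assumed form of the weighted Jensen-Tsallis q-difference:
    #   T_q(p1, p2) = S_q(w1*p1 + w2*p2) - (w1**q * S_q(p1) + w2**q * S_q(p2))
    # At q = 1 (and equal weights) this is exactly the
    # Jensen-Shannon divergence of p1 and p2.
    p1, p2 = np.asarray(p1, dtype=float), np.asarray(p2, dtype=float)
    w1, w2 = w
    mix = w1 * p1 + w2 * p2
    return tsallis_entropy(mix, q) - (
        w1 ** q * tsallis_entropy(p1, q) + w2 ** q * tsallis_entropy(p2, q)
    )
```

For example, with two term distributions over a shared vocabulary, `jensen_tsallis_q_difference(p1, p2, 1.0)` gives the usual Jensen-Shannon divergence, while varying q interpolates through the nonextensive family the paper studies.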