Information estimators for weighted observations
Neural Networks
The Shannon information content is a fundamental quantity, and estimating it from observed data is important in statistics, information theory, and machine learning. In this study, an estimator of the information content for a given set of weighted data is proposed; the empirical data distribution varies depending on the weights. The notable features of the proposed estimator are its computational efficiency and its ability to handle weighted data. The estimator is then extended to estimate cross entropy, entropy, and Kullback-Leibler (KL) divergence from weighted data. Finally, the estimators are applied to classification with one-class samples and to distribution-preserving data compression.
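To make the weighted-data setting concrete, the sketch below computes the entropy of a weighted empirical distribution in the simple discrete case. This is only an illustration of how sample weights reshape the empirical distribution, not the continuous estimator proposed in the paper; the function name `weighted_entropy` is a hypothetical helper for this example.

```python
import numpy as np

def weighted_entropy(values, weights):
    """Entropy (in nats) of the weighted empirical distribution.

    Each observation contributes probability mass proportional to its
    weight, so reweighting the samples changes the estimated entropy.
    """
    w = np.asarray(weights, dtype=float)
    w = w / w.sum()  # normalize weights into a probability distribution
    # Aggregate the weights of identical observed values
    mass = {}
    for v, wi in zip(values, w):
        mass[v] = mass.get(v, 0.0) + wi
    p = np.array(list(mass.values()))
    return float(-np.sum(p * np.log(p)))
```

For example, with observations `['a', 'a', 'b']` and weights `[1, 1, 2]`, both symbols receive probability 1/2 and the entropy is log 2, whereas uniform weights would give the entropy of the distribution (2/3, 1/3).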