Entropy and its various generalizations are widely used in mathematical statistics, communication theory, and the physical and computer sciences to characterize the amount of information in a probability distribution. We consider estimators of the quadratic Rényi entropy and some related characteristics of discrete and continuous probability distributions, based on the number of coincident (or ε-close) vector observations in the corresponding independent and identically distributed sample. We establish asymptotic properties of these estimators, such as consistency and asymptotic normality. These estimators can be applied to various problems in mathematical statistics and computer science, e.g., distribution identification, average-case analysis of random databases, approximate pattern matching in bioinformatics, and cryptography.
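To illustrate the coincidence idea in the discrete case, the following minimal sketch (Python/NumPy; an assumption-laden illustration, not the authors' exact construction) estimates the quadratic Rényi entropy H₂ = −log Σₖ pₖ² from the fraction of coincident pairs in an i.i.d. sample. The function name and the test distribution are hypothetical.

```python
# Sketch only: coincidence-based estimate of the quadratic Renyi entropy
# H_2 = -log sum_k p_k^2 for a discrete distribution, assuming an i.i.d. sample.
import numpy as np

def quadratic_renyi_entropy(sample: np.ndarray) -> float:
    """Estimate H_2 from the number of coincident pairs in the sample."""
    n = len(sample)
    # Number of coincident (equal) unordered pairs, computed from the
    # category counts c_k: sum_k c_k * (c_k - 1) / 2.
    _, counts = np.unique(sample, return_counts=True)
    coincidences = np.sum(counts * (counts - 1)) / 2
    # coincidences / C(n, 2) is an unbiased estimate of sum_k p_k^2.
    q_hat = coincidences / (n * (n - 1) / 2)
    return -np.log(q_hat)

# Usage: compare the estimate with the true H_2 of a known distribution.
rng = np.random.default_rng(0)
p = np.array([0.5, 0.25, 0.125, 0.125])          # hypothetical distribution
x = rng.choice(len(p), size=100_000, p=p)
print(quadratic_renyi_entropy(x))                # estimate from coincidences
print(-np.log(np.sum(p ** 2)))                   # true value, about 1.068
```

For continuous distributions, the same scheme would count ε-close pairs and normalize by the volume of an ε-ball, yielding an estimate of −log ∫ f²; that extension is omitted here.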