Intrinsic Dimension Estimation of Data: An Approach Based on Grassberger–Procaccia's Algorithm
Neural Processing Letters
In this paper we show that the correlation integral can be decomposed into functions, each associated with a particular point of the data space. These functions admit polynomial approximations similar to the one used for the correlation integral itself; the essential difference is that the value of the exponent, which would correspond to the correlation dimension, varies with the position of the point in question. Moreover, we show that the multiplicative constant represents a probability density estimate at that point. This finding is used to construct a classifier. Tests on several data sets from the Machine Learning Repository show that this classifier can be very effective.
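The abstract builds on the Grassberger–Procaccia correlation integral and its point-wise decomposition. The following NumPy sketch is not the authors' implementation; it illustrates the standard construction under simple assumptions: Euclidean distances, a power-law fit `C(r) ≈ c·r^d` done by least squares in log–log coordinates, and an illustrative choice of fitting radii. The global slope estimates the correlation dimension; in the point-wise variant `C_i(r)`, the slope plays the role of a local exponent and the fitted constant relates to the local density, mirroring the decomposition described above.

```python
import numpy as np

def correlation_integral(X, radii):
    """Grassberger-Procaccia correlation integral C(r): the fraction
    of distinct point pairs whose distance is below r."""
    dists = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=-1)
    iu = np.triu_indices(len(X), k=1)      # each pair counted once
    pair_d = dists[iu]
    return np.array([(pair_d < r).mean() for r in radii])

def global_dimension(X, radii):
    """Correlation dimension: slope of log C(r) versus log r."""
    C = correlation_integral(X, radii)
    ok = C > 0                             # log is undefined at C = 0
    slope, _ = np.polyfit(np.log(radii[ok]), np.log(C[ok]), 1)
    return slope

def local_fit(X, i, radii):
    """Point-wise analogue: C_i(r) is the fraction of the other points
    lying within r of X[i].  The fitted slope acts as a local exponent;
    exp(intercept) is the multiplicative constant of the power law."""
    d = np.linalg.norm(X - X[i], axis=1)
    d = np.delete(d, i)                    # exclude the point itself
    Ci = np.array([(d < r).mean() for r in radii])
    ok = Ci > 0
    slope, intercept = np.polyfit(np.log(radii[ok]), np.log(Ci[ok]), 1)
    return slope, np.exp(intercept)

rng = np.random.default_rng(0)
# 2-D uniform data embedded in 5-D space: intrinsic dimension is 2
X = np.hstack([rng.uniform(size=(1000, 2)), np.zeros((1000, 3))])
radii = np.logspace(-1.5, -0.7, 10)        # fitting range (assumed)
```

With this synthetic set, `global_dimension(X, radii)` comes out close to 2 rather than the embedding dimension 5, and `local_fit(X, i, radii)` returns a per-point exponent and constant of the kind the paper feeds into its classifier.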