Visualizing changes in the structure of data for exploratory feature selection
Proceedings of the ninth ACM SIGKDD international conference on Knowledge discovery and data mining
We introduce an algorithm for learning a local metric on a continuous input space that measures distances in terms of relevance to the processing task. Relevance is defined by local changes in discrete auxiliary information, which may be, for example, the class of the data items, an index of performance, or a contextual input. A set of neurons first learns representations that maximize the mutual information between their outputs and the random variable representing the auxiliary information. The implicit knowledge gained about relevance is then transformed into a new metric on the input space that measures change in the auxiliary information, in the sense of local approximations to the Kullback-Leibler divergence. The new metric can be used in further processing by other algorithms. It is especially useful in data-analysis applications, since the distances can be interpreted in terms of the local relevance of the original variables.
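The local KL-based metric described in the abstract can be sketched as follows. Locally, 2·KL(p(c|x) ‖ p(c|x+dx)) ≈ dxᵀ J(x) dx, where J(x) is the Fisher information matrix of the conditional distribution p(c|x) of the auxiliary variable. The toy softmax model below stands in for the mutual-information-trained neurons; the model, the function names, and all parameters are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def fisher_metric(x, W, b):
    """Local metric J(x) for a toy softmax model p(c|x) = softmax(Wx + b).

    J(x) = E_{c ~ p(c|x)}[ g_c g_c^T ],  g_c = grad_x log p(c|x),
    which is the Fisher information giving the local quadratic
    approximation to the KL divergence between nearby conditionals.
    """
    logits = W @ x + b
    p = np.exp(logits - logits.max())
    p /= p.sum()
    # For softmax: grad_x log p(c|x) = W_c - sum_k p(k|x) W_k
    G = W - p @ W                       # row c holds g_c
    # Weighted sum of outer products: sum_c p(c|x) g_c g_c^T
    return G.T @ (p[:, None] * G)

def local_distance_sq(x, dx, W, b):
    """Squared relevance distance d^2(x, x + dx) = dx^T J(x) dx."""
    J = fisher_metric(x, W, b)
    return float(dx @ J @ dx)

# Directions along which p(c|x) changes quickly are "long" in this
# metric; directions irrelevant to the auxiliary variable are "short".
np.random.seed(0)
W, b = np.random.randn(3, 2), np.zeros(3)
x = np.array([0.5, -0.2])
d2 = local_distance_sq(x, np.array([0.1, 0.0]), W, b)
```

Because J(x) is a sum of outer products weighted by probabilities, it is symmetric positive semidefinite, so the resulting local distances are always non-negative; its eigenvectors can also be inspected to interpret which original input variables are locally relevant, as the abstract notes.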