Existing axis-scaling and dimensionality reduction methods focus on preserving structure, usually measured via the Euclidean distance. In other words, they inherently assume that the Euclidean distance is already correct. We instead propose a novel nonlinear approach driven by an information-theoretic viewpoint, which we show is also strongly linked to intrinsic dimensionality (degrees of freedom) and to uniformity. Nonlinear transformations based on common probability distributions, combined with information-driven selection, simultaneously reduce the number of dimensions required and increase the value of those we retain. Experiments on real data confirm that this approach reveals correlations, finds novel attributes, and scales well.
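The abstract's pipeline can be illustrated with a minimal sketch (not the authors' actual method): each attribute is nonlinearly rescaled through the CDF of a fitted common distribution (here a normal, via the probability integral transform), and attributes are then ranked by an information measure (a histogram entropy estimate, where higher entropy means closer to uniform). The function names, the choice of distribution, and the bin count are all illustrative assumptions.

```python
import numpy as np
from scipy import stats

def cdf_transform(X):
    """Nonlinearly rescale each column through the CDF of a fitted
    normal distribution (probability integral transform -> [0, 1]).
    Illustrative choice; any common distribution could be fitted."""
    Xt = np.empty_like(X, dtype=float)
    for j in range(X.shape[1]):
        mu, sigma = X[:, j].mean(), X[:, j].std() + 1e-12
        Xt[:, j] = stats.norm.cdf(X[:, j], loc=mu, scale=sigma)
    return Xt

def entropy_scores(Xt, bins=16):
    """Histogram estimate of each transformed column's Shannon entropy;
    under this view, higher entropy ~ more uniform ~ more informative."""
    scores = []
    for j in range(Xt.shape[1]):
        counts, _ = np.histogram(Xt[:, j], bins=bins, range=(0.0, 1.0))
        p = counts / counts.sum()
        p = p[p > 0]
        scores.append(-(p * np.log2(p)).sum())
    return np.array(scores)

# Toy data: one varying attribute, one constant (zero-information) attribute.
rng = np.random.default_rng(0)
X = np.column_stack([rng.normal(size=500),   # informative after transform
                     np.full(500, 3.0)])     # constant: entropy 0
scores = entropy_scores(cdf_transform(X))
keep = np.argsort(scores)[::-1][:1]          # keep the k=1 highest-entropy attribute
```

The constant column collapses to a single CDF value, so its entropy is zero and it is dropped, while the varying column maps to a near-uniform distribution and is retained.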