Dimensionality reduction techniques exploit the fact that most real-world high-dimensional datasets do not uniformly fill the hyperspaces in which they are represented; instead, their distributions usually concentrate near nonlinear manifolds of lower intrinsic dimension. However, when these techniques are applied directly to the initial degraded and noisy data, the assumptions on the possible statistical separation of real-world classes do not, in the general case, hold. In this paper, we argue that scale space filtering, by effectively denoising and simplifying the initial dataset, improves the way the properties of our observations are encoded and thereby strengthens the assumptions on the possible statistical separation of real-world classes. Experimental results on real hyperspectral datasets demonstrate that appropriate vector-valued scale space filtering significantly improves intrinsic dimension estimation and dimensionality reduction for high-dimensional datasets.
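The core claim can be illustrated with a minimal sketch (all names, the toy data, and the parameters are illustrative, not from the paper; a plain 1-D Gaussian filter stands in for the vector-valued PDE-based scale space filtering): smoothing a noisy sample of a low-dimensional manifold before PCA concentrates the variance into fewer components, lowering a simple variance-based intrinsic dimension estimate.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical toy data: a 1-D closed curve (intrinsically 2-D in ambient
# coordinates) embedded in 50 dimensions, corrupted by isotropic noise.
t = np.linspace(0.0, 1.0, 400)
basis = rng.standard_normal((2, 50))
clean = np.column_stack([np.sin(2 * np.pi * t), np.cos(2 * np.pi * t)]) @ basis
noisy = clean + 0.5 * rng.standard_normal(clean.shape)

def intrinsic_dim(X, var_fraction=0.95):
    """Crude intrinsic dimension estimate: number of PCA components
    needed to explain `var_fraction` of the total variance."""
    Xc = X - X.mean(axis=0)
    eigvals = np.linalg.svd(Xc, compute_uv=False) ** 2
    ratios = np.cumsum(eigvals) / eigvals.sum()
    return int(np.searchsorted(ratios, var_fraction) + 1)

def gaussian_filter1d(X, sigma):
    """Gaussian scale-space filter along the sample axis, a stand-in
    for the vector-valued filtering used in the paper."""
    radius = int(3 * sigma)
    x = np.arange(-radius, radius + 1)
    kernel = np.exp(-x**2 / (2 * sigma**2))
    kernel /= kernel.sum()
    return np.apply_along_axis(
        lambda col: np.convolve(col, kernel, mode="same"), 0, X)

smoothed = gaussian_filter1d(noisy, sigma=5.0)

# Noise spreads variance across many components; filtering at an
# appropriate scale re-concentrates it near the manifold's dimension.
print("noisy:", intrinsic_dim(noisy), "smoothed:", intrinsic_dim(smoothed))
```

The filter scale (here `sigma=5.0`) plays the role of the scale parameter in scale space theory: too small and the noise survives, too large and the manifold itself is flattened.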