The nature of statistical learning theory
Mapping a manifold of perceptual observations
NIPS '97 Proceedings of the 1997 conference on Advances in neural information processing systems 10
Rank-deficient and discrete ill-posed problems: numerical aspects of linear inversion
Think globally, fit locally: unsupervised learning of low dimensional manifolds
The Journal of Machine Learning Research
ICPR '06 Proceedings of the 18th International Conference on Pattern Recognition - Volume 03
An introduction to nonlinear dimensionality reduction by maximum variance unfolding
AAAI'06 proceedings of the 21st national conference on Artificial intelligence - Volume 2
Global and local choice of the number of nearest neighbors in locally linear embedding
Pattern Recognition Letters
Correntropy: properties and applications in non-Gaussian signal processing
IEEE Transactions on Signal Processing
Generalized correlation function: definition, properties, and application to blind equalization
IEEE Transactions on Signal Processing
Supervised nonlinear dimensionality reduction for visualization and classification
IEEE Transactions on Systems, Man, and Cybernetics, Part B: Cybernetics
Linear dimensionality reduction (DR) is a widely used technique in pattern recognition for controlling the dimensionality of input data, but it neither preserves discriminability nor is able to discover the nonlinear degrees of freedom present in natural observations. More recently, nonlinear dimensionality reduction (NLDR) algorithms have been developed that exploit the fact that data may lie on a nonlinear manifold embedded within a high-dimensional feature space. Nevertheless, when the input data are corrupted by noise and outliers, most nonlinear techniques, especially Locally Linear Embedding (LLE), do not produce suitable embeddings. The culprit is the Euclidean distance, used in the LLE cost function, which does not correctly represent the dissimilarity between objects and so inflates the error caused by corrupted observations. In this work, the Euclidean distance is replaced by the correntropy induced metric (CIM), which is particularly well suited to handling outliers. Moreover, we extend NLDR to handle a manifold divided into separate groups, or several manifolds at once, by employing class label information (CLI), yielding a discriminative representation of the data in a low-dimensional space. The correntropy LLE+CLI approach is tested for visualization and classification on noisy artificial and real-world data sets. The results confirm that the proposed approach reduces the negative effects of outliers and noise in the low-dimensional space. In addition, it outperforms the other NLDR techniques in terms of classification accuracy.
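For intuition, below is a minimal sketch of the correntropy induced metric that the abstract proposes as a replacement for the Euclidean distance in the LLE cost. The abstract itself gives no formula, so the standard definition from the correntropy literature is assumed here, with an unnormalized Gaussian kernel and an illustrative kernel width sigma; the toy vectors are likewise made up for illustration and are not from the paper.

import numpy as np

def correntropy_induced_metric(x, y, sigma=1.0):
    """Correntropy induced metric (CIM) between two vectors.

    Assumes the usual definition CIM(x, y) = sqrt(kappa(0) - mean_i kappa(x_i - y_i))
    with an unnormalized Gaussian kernel kappa(e) = exp(-e**2 / (2 * sigma**2)),
    so kappa(0) = 1. The kernel width sigma is a free parameter chosen here
    for illustration only.
    """
    e = np.asarray(x, dtype=float) - np.asarray(y, dtype=float)
    kappa = np.exp(-e ** 2 / (2.0 * sigma ** 2))
    return np.sqrt(max(1.0 - kappa.mean(), 0.0))

# One grossly corrupted coordinate dominates the Euclidean distance, but its
# kernel value simply saturates near 0 in the CIM, so its influence is bounded.
clean = np.array([1.0, 2.0, 3.0, 4.0])
noisy = clean + np.array([0.0, 0.0, 0.0, 50.0])  # single outlying coordinate

print(np.linalg.norm(clean - noisy))              # Euclidean distance: 50.0
print(correntropy_induced_metric(clean, noisy))   # CIM stays bounded (at most 1.0)

This bounded behavior under gross corruption is the property the abstract appeals to when it calls the CIM "particularly well suited to handling outliers".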