We consider the problems of classification and intrinsic dimension estimation on image data. A new subspace-based classifier is proposed for supervised classification or intrinsic dimension estimation. The distribution of the data in each class is modeled by a union of a finite number of affine subspaces of the feature space. The affine subspaces share a common dimension, which is assumed to be much smaller than the dimension of the feature space. The subspaces are found using regression based on the ℓ0-norm. The proposed method generalizes the classical NN (Nearest Neighbor) and NFL (Nearest Feature Line) classifiers and is closely related to the NS (Nearest Subspace) classifier. With an accurately estimated dimension parameter, the proposed classifier generally outperforms its competitors in classification accuracy. We also propose a fast version of the classifier that uses a neighborhood representation to reduce computational complexity. Experiments on publicly available datasets corroborate these claims.
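To make the subspace-modeling idea concrete, the sketch below implements the related NS (Nearest Subspace) baseline mentioned above: each class is fitted with a single d-dimensional affine subspace (mean plus top-d principal directions via SVD), and a query point is assigned to the class whose subspace it is closest to in residual norm. This is a minimal illustration under simplifying assumptions, not the authors' ℓ0-regression method, which fits a union of several affine subspaces per class; all function names here are hypothetical.

```python
import numpy as np

def fit_affine_subspace(X, d):
    """Fit a d-dimensional affine subspace (mean + top-d principal
    directions) to the rows of X via SVD.  Illustrative stand-in for
    the paper's l0-regression subspace fitting."""
    mu = X.mean(axis=0)
    # Right singular vectors of the centered data give the principal axes.
    _, _, Vt = np.linalg.svd(X - mu, full_matrices=False)
    return mu, Vt[:d]  # (offset, orthonormal basis rows spanning the subspace)

def residual(x, mu, B):
    """Distance from x to the affine subspace {mu + B.T @ c}."""
    r = x - mu
    return np.linalg.norm(r - B.T @ (B @ r))

def nearest_subspace_predict(x, subspaces):
    """Assign x to the class label whose fitted subspace is closest."""
    return min(subspaces, key=lambda c: residual(x, *subspaces[c]))
```

With d = 0 each subspace degenerates to the class mean (a nearest-mean rule), and letting the subspaces interpolate pairs of training points recovers NFL, which is the sense in which the proposed classifier generalizes these methods.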