From the viewpoint of classification, linear discriminant analysis (LDA) is a natural dimensionality reduction method, as it finds an optimal linear transformation that maximizes class separability. However, it is difficult to apply LDA to undersampled problems, where the number of data samples is smaller than the dimensionality of the data space, because the high dimensionality makes the scatter matrices singular. To make LDA applicable in this setting, we propose a new dimensionality reduction algorithm called discriminant multidimensional mapping (DMM), which combines the advantages of multidimensional scaling (MDS) and LDA. DMM is effective for small, high-dimensional datasets, and its superiority is established from a theoretical point of view. We then extend DMM to large datasets and to datasets with nonlinear manifold structure, obtaining two further algorithms: landmark DMM (LDMM) and geodesic-metric discriminant mapping (GDM). The performance of these algorithms is demonstrated by preliminary numerical experiments.
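To make the undersampled-problem concrete, the sketch below illustrates the general MDS-then-LDA idea the abstract describes: classical MDS embeds the n samples into a low-dimensional coordinate space (at most n-1 nonzero dimensions), where the within-class scatter matrix is no longer singular and Fisher's LDA can be applied directly. This is an illustrative reconstruction under stated assumptions, not the authors' DMM algorithm; the functions `classical_mds` and `lda_1d`, the regularization constant, and the synthetic data are all our own choices.

```python
import numpy as np

def classical_mds(X, k):
    """Embed rows of X into k dimensions via classical MDS (illustrative)."""
    sq = np.sum(X**2, axis=1)
    D2 = sq[:, None] + sq[None, :] - 2.0 * X @ X.T   # squared distances
    n = len(X)
    J = np.eye(n) - np.ones((n, n)) / n               # centering matrix
    B = -0.5 * J @ D2 @ J                             # double-centered Gram
    w, V = np.linalg.eigh(B)
    idx = np.argsort(w)[::-1][:k]                     # top-k eigenpairs
    return V[:, idx] * np.sqrt(np.maximum(w[idx], 0.0))

def lda_1d(Z, y):
    """Fisher LDA direction for two classes on low-dim coordinates Z."""
    m0, m1 = Z[y == 0].mean(0), Z[y == 1].mean(0)
    Sw = sum((Z[y == c] - m).T @ (Z[y == c] - m)
             for c, m in [(0, m0), (1, m1)])          # within-class scatter
    # Sw is nonsingular here because Z has fewer columns than samples;
    # the tiny ridge term is only a numerical safeguard (our choice).
    w = np.linalg.solve(Sw + 1e-8 * np.eye(Z.shape[1]), m1 - m0)
    return w / np.linalg.norm(w)

rng = np.random.default_rng(0)
n, d = 20, 500                      # n << d: the undersampled regime
y = np.repeat([0, 1], n // 2)
X = rng.normal(size=(n, d))
X[y == 1] += 2.0                    # separate class 1 (synthetic data)

Z = classical_mds(X, k=10)          # dimensionality now well below n
w = lda_1d(Z, y)
proj = Z @ w                        # 1-D discriminant projection
```

In the original d = 500 space the within-class scatter matrix has rank at most n - 2 = 18 and cannot be inverted; after the MDS step the data live in 10 dimensions, so Fisher's criterion is well defined.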