In this paper, we propose a new discriminant analysis based on a datawise formulation of the scatter matrices, designed to handle data with non-normal distributions. Starting from the original LDA, the datawise formulation of the scatter matrices is derived and its meaning is clarified. Based on this formulation, a new feature extraction algorithm is presented. Because the formulation makes no assumption about the distribution of the data, an appropriate feature space can be found for data whose distribution is non-normal as well as multimodally normal. The limitation on the feature dimension is also removed, and by replacing the inverse of the within-class scatter matrix with specially assigned weights, the computational problems originating from inverting the within-class scatter matrix are fundamentally avoided. As a result, a good feature space for classification can be found without the well-known problems of LDA. The performance of the algorithm has been evaluated on real classification tasks.
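The abstract does not spell out the datawise formulation itself, but standard LDA scatter matrices admit a well-known pairwise ("datawise") rewriting in which each scatter matrix is a weighted sum of outer products of sample differences. The sketch below illustrates that identity only; the weight matrices `Ww` and `Wb` shown here recover ordinary LDA and are not the specially assigned weights of the proposed algorithm, which the abstract leaves unspecified.

```python
import numpy as np

def pairwise_scatter(X, y):
    """Datawise (pairwise) form of the LDA scatter matrices:
        S = 0.5 * sum_{i,j} W[i, j] * (x_i - x_j)(x_i - x_j)^T
    With Ww[i,j] = 1/n_c for same-class pairs (0 otherwise) and
    Wb[i,j] = 1/n - 1/n_c for same-class pairs (1/n otherwise),
    this reproduces the usual within- and between-class scatter.
    """
    n, d = X.shape
    classes, counts = np.unique(y, return_counts=True)
    n_c = dict(zip(classes, counts))

    # Pairwise weight matrices for the within- and between-class terms.
    Ww = np.zeros((n, n))
    Wb = np.zeros((n, n))
    for i in range(n):
        for j in range(n):
            if y[i] == y[j]:
                Ww[i, j] = 1.0 / n_c[y[i]]
                Wb[i, j] = 1.0 / n - 1.0 / n_c[y[i]]
            else:
                Wb[i, j] = 1.0 / n

    # Accumulate weighted outer products of sample differences.
    Sw = np.zeros((d, d))
    Sb = np.zeros((d, d))
    for i in range(n):
        for j in range(n):
            diff = np.outer(X[i] - X[j], X[i] - X[j])
            Sw += 0.5 * Ww[i, j] * diff
            Sb += 0.5 * Wb[i, j] * diff
    return Sw, Sb
```

Replacing `Ww` and `Wb` with other data-dependent weights changes the notion of scatter without ever forming class means or inverting the within-class scatter matrix, which is the flexibility the abstract appeals to.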