Dimension reduction is an important topic in data mining and machine learning. In particular, dimension reduction combined with feature fusion is an effective preprocessing step when the data are described by multiple feature sets. Canonical Correlation Analysis (CCA) and Discriminative Canonical Correlation Analysis (DCCA) are feature fusion methods based on correlation; they differ in that DCCA is a supervised method that utilizes class label information, whereas CCA is unsupervised. It has been shown that DCCA outperforms CCA in classification because of the discriminative power gained from class labels. Linear Discriminant Analysis (LDA), on the other hand, is a supervised dimension reduction method known to be a special case of CCA. In this paper, we analyze the relationship between DCCA and LDA, showing that the projective directions obtained by DCCA are equal, up to an orthogonal transformation, to those obtained by LDA. Exploiting this relation with LDA, we propose a new method that enhances the performance of DCCA. Experimental results show that the proposed method achieves better classification performance than the original DCCA.
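The statement that LDA is a special case of CCA can be checked numerically: performing CCA between the data matrix and a one-hot class-indicator matrix yields projective directions that span the same subspace as the LDA directions. The sketch below illustrates this equivalence on synthetic data (the dataset, dimensions, and variable names are illustrative assumptions, not taken from the paper); it compares the two subspaces via the cosines of their principal angles.

```python
import numpy as np
from scipy.linalg import eigh

rng = np.random.default_rng(0)
# Hypothetical synthetic data: 3 classes, 4 features, 50 samples per class.
means = np.array([[0., 0, 0, 0], [3, 1, 0, 0], [0, 3, 1, 0]])
X = np.vstack([rng.normal(mu, 1.0, size=(50, 4)) for mu in means])
y = np.repeat([0, 1, 2], 50)
d, c = X.shape[1], 3

# Between-class and within-class scatter matrices.
m = X.mean(0)
Sb = np.zeros((d, d))
Sw = np.zeros((d, d))
for k in range(c):
    Xk = X[y == k]
    dk = (Xk.mean(0) - m)[:, None]
    Sb += len(Xk) * (dk @ dk.T)
    Sw += (Xk - Xk.mean(0)).T @ (Xk - Xk.mean(0))

# LDA: top c-1 generalized eigenvectors of Sb v = lambda * Sw v.
_, V = eigh(Sb, Sw)                # eigenvalues in ascending order
W_lda = V[:, -(c - 1):]

# CCA between centered X and centered one-hot labels: the x-side
# canonical directions solve Sb v = rho^2 * St v with St = Xc^T Xc,
# since Sxy Syy^+ Syx reduces to Sb for an indicator matrix Y.
Xc = X - m
Y = np.eye(c)[y]
Yc = Y - Y.mean(0)
Sxy = Xc.T @ Yc
M = Sxy @ np.linalg.pinv(Yc.T @ Yc) @ Sxy.T
_, U = eigh(M, Xc.T @ Xc)
W_cca = U[:, -(c - 1):]

# Cosines of the principal angles between the two subspaces; values
# near 1 mean the subspaces coincide (directions agree up to an
# invertible transformation within the subspace).
Q1, _ = np.linalg.qr(W_lda)
Q2, _ = np.linalg.qr(W_cca)
cosines = np.linalg.svd(Q1.T @ Q2, compute_uv=False)
print(np.round(cosines, 6))
```

Note that the two eigenproblems share eigenvectors because St = Sb + Sw, so Sb v = λ Sw v is equivalent to Sb v = (λ / (1 + λ)) St v with the same ordering of eigenvalues.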