In this study, a unified scheme combining divergence analysis and genetic search is proposed to determine the significant components of feature vectors in high-dimensional spaces without encountering singular-matrix problems. Three main problems are reported in the literature for feature selection in high-dimensional spaces: high computational load, local optima, and singular matrices. Here, feature selection is realized by increasing the dimension one feature at a time rather than by reducing it. To decrease the computational load, the covariance matrices are formulated recursively, so that the statistics of a subset of dimension k+1 are obtained from those of the k-dimensional subset already evaluated. Genetic algorithms are employed to avoid local optima and singular-matrix problems in high-dimensional feature spaces: candidate strings in the genetic pool represent the new feature subsets formed by increasing the dimension, and the genetic search seeks the combination of features yielding the highest divergence value. Two selection methods are proposed. The first determines features in a high-dimensional space using divergence analysis and genetic search (DAGS) together; when the dimension is not high, the second method, recursive divergence analysis (RDA) without genetic search, is offered instead. Section 3 presents two experiments: feature determination in a two-dimensional phantom feature space, and feature determination for ECG beat classification on real data.
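The abstract specifies neither the exact divergence measure nor the genetic encoding. The sketch below is a minimal illustration, assuming the standard two-class Gaussian divergence as the fitness function and a simple fixed-size subset encoding for the candidate strings; the names divergence and genetic_search and the parameters pop, gens, and pmut are hypothetical choices, not the authors' implementation.

import numpy as np

rng = np.random.default_rng(0)

def divergence(X1, X2, idx):
    # Two-class divergence over the feature subset `idx`, assuming
    # Gaussian class models; the small ridge terms guard against the
    # singular covariance matrices the paper warns about.
    a, b = X1[:, idx], X2[:, idx]
    m = (a.mean(axis=0) - b.mean(axis=0)).reshape(-1, 1)
    S1 = np.atleast_2d(np.cov(a, rowvar=False)) + 1e-6 * np.eye(len(idx))
    S2 = np.atleast_2d(np.cov(b, rowvar=False)) + 1e-6 * np.eye(len(idx))
    P1, P2 = np.linalg.inv(S1), np.linalg.inv(S2)
    return 0.5 * np.trace((S1 - S2) @ (P2 - P1)) \
         + 0.5 * (m.T @ (P1 + P2) @ m).item()

def genetic_search(X1, X2, k, pop=30, gens=50, pmut=0.1):
    # Each candidate string is a fixed-size set of k feature indices;
    # fitness is the divergence of the two classes over that subset.
    d = X1.shape[1]
    popu = [rng.choice(d, size=k, replace=False) for _ in range(pop)]
    for _ in range(gens):
        fit = [divergence(X1, X2, idx) for idx in popu]
        order = np.argsort(fit)[::-1]
        survivors = [popu[i] for i in order[:pop // 2]]   # elitist selection
        children = []
        while len(survivors) + len(children) < pop:
            ia, ib = rng.choice(len(survivors), size=2, replace=False)
            genes = np.union1d(survivors[ia], survivors[ib])
            child = rng.choice(genes, size=k, replace=False)  # crossover
            if rng.random() < pmut:                           # mutation
                outside = np.setdiff1d(np.arange(d), child)
                child[rng.integers(k)] = rng.choice(outside)
            children.append(child)
        popu = survivors + children
    fit = [divergence(X1, X2, idx) for idx in popu]
    return popu[int(np.argmax(fit))]

# Toy usage: two Gaussian classes whose means differ only along
# features 0 and 3; the search should recover those two indices.
X1 = rng.normal(size=(200, 10)); X1[:, [0, 3]] += 2.0
X2 = rng.normal(size=(200, 10))
print(sorted(genetic_search(X1, X2, k=2)))

In the paper's own scheme the dimension is grown one feature at a time with recursively updated covariance matrices, so each fitness evaluation would be cheaper than the from-scratch np.cov calls used in this sketch.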