In this paper, a new feature transformation method is introduced to decrease the misclassification rate. Linear classifiers are often unable to separate feature vectors that lie in a high-dimensional feature space, and when the feature vectors from different classes have severely overlapping underlying distributions, classifying them with acceptable accuracy becomes even harder. In such cases, dimensionality reduction or feature transformation typically seeks a feature subspace in which the feature vectors are well separated; however, it still cannot overcome misclassifications that arise from the overlapping region. The proposed feature transformation first increases the dimension of each feature vector by combining it with other feature vectors from the same class, and then applies a conventional dimensionality-reduction step. The experimental results show that this sequential process yields significantly improved separability for linear classifiers.
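The augment-then-reduce pipeline described above can be sketched in a minimal form. This is not the paper's exact procedure: the specific rule for combining same-class feature vectors is not stated here, so the sketch makes the assumption of concatenating each sample with its class mean, and it uses standard two-class Fisher LDA as the dimensionality-reduction step.

```python
import numpy as np

def augment(X, y):
    # Assumed combination rule: concatenate each sample with the mean of
    # its own class, doubling the feature dimension. The paper combines
    # "other feature vectors in the same class"; this choice is illustrative.
    means = {c: X[y == c].mean(axis=0) for c in np.unique(y)}
    return np.hstack([X, np.vstack([means[c] for c in y])])

def fisher_direction(X, y):
    # Two-class Fisher LDA: w proportional to Sw^{-1} (m1 - m0),
    # where Sw is the pooled within-class scatter matrix.
    X0, X1 = X[y == 0], X[y == 1]
    m0, m1 = X0.mean(axis=0), X1.mean(axis=0)
    Sw = (np.cov(X0, rowvar=False) * (len(X0) - 1)
          + np.cov(X1, rowvar=False) * (len(X1) - 1))
    # Small ridge term keeps Sw invertible (the appended class-mean
    # coordinates are constant within each class, so Sw is singular).
    w = np.linalg.solve(Sw + 1e-6 * np.eye(Sw.shape[0]), m1 - m0)
    return w / np.linalg.norm(w)

# Two overlapping Gaussian classes in 2-D.
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(0.0, 1.0, (50, 2)),
               rng.normal(1.0, 1.0, (50, 2))])
y = np.array([0] * 50 + [1] * 50)

Xa = augment(X, y)            # dimension grows from 2 to 4
w = fisher_direction(Xa, y)   # then reduce back to one dimension
z = Xa @ w                    # 1-D projections used for classification
```

A linear classifier would then threshold the 1-D projection `z`; the claim under study is that separability after this sequential process exceeds that of reducing the raw vectors directly.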