A nonparametric method of discriminant analysis is proposed. It is based on nonparametric extensions of commonly used scatter matrices. Two advantages result from the use of the proposed nonparametric scatter matrices. First, they are generally of full rank. This provides the ability to specify the number of extracted features desired, in contrast to parametric discriminant analysis, which for an L-class problem can typically determine at most L − 1 features. Second, the nonparametric nature of the scatter matrices allows the procedure to work well even for non-Gaussian data sets. Using the same basic framework, a procedure is proposed to test the structural similarity of two distributions. The procedure works in high-dimensional space and specifies a linear decomposition of the original data space in which a relative indication of dissimilarity along each new basis vector is provided. The nonparametric scatter matrices are also used to derive a clustering procedure, which is recognized as a k-nearest neighbor version of the nonparametric valley-seeking algorithm. The form that results provides a unified view of the parametric nearest-mean reclassification algorithm and the nonparametric valley-seeking algorithm.
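The feature-extraction idea can be sketched in code. The following is a simplified illustration, not the paper's exact formulation: it replaces the parametric between-class scatter with a scatter built from each sample's k nearest neighbors in the *other* classes, and the function names, the uniform (unweighted) neighbor contribution, and the regularization constant are assumptions for the sketch.

```python
import numpy as np

def nonparametric_scatter(X, y, k=3):
    """Simplified nonparametric between-class scatter Sb and pooled
    within-class scatter Sw. For each sample, Sb accumulates the outer
    product of the difference to the local mean of its k nearest
    neighbors drawn from the other classes (uniform weighting assumed)."""
    d = X.shape[1]
    Sb = np.zeros((d, d))
    Sw = np.zeros((d, d))
    for c in np.unique(y):
        Xc, Xo = X[y == c], X[y != c]
        mu_c = Xc.mean(axis=0)
        for x in Xc:
            # local mean of the k nearest neighbors from the other classes
            dists = np.linalg.norm(Xo - x, axis=1)
            local_mean = Xo[np.argsort(dists)[:k]].mean(axis=0)
            db = (x - local_mean)[:, None]
            Sb += db @ db.T
            dw = (x - mu_c)[:, None]
            Sw += dw @ dw.T
    return Sb, Sw

def nda_projection(X, y, n_components, k=3):
    """Extract n_components directions maximizing nonparametric
    between-class scatter relative to within-class scatter."""
    Sb, Sw = nonparametric_scatter(X, y, k)
    d = X.shape[1]
    # small ridge so Sw is invertible even for few samples (assumption)
    evals, evecs = np.linalg.eig(np.linalg.solve(Sw + 1e-6 * np.eye(d), Sb))
    order = np.argsort(-evals.real)
    return evecs.real[:, order[:n_components]]
```

Because the nonparametric Sb is generally of full rank, this sketch can return two (or more) projection directions even for a two-class problem, where parametric discriminant analysis would be limited to a single feature.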