Introduction to statistical pattern recognition (2nd ed.)
Eigenfaces vs. Fisherfaces: Recognition Using Class Specific Linear Projection. IEEE Transactions on Pattern Analysis and Machine Intelligence.
Statistical Pattern Recognition: A Review. IEEE Transactions on Pattern Analysis and Machine Intelligence.
The anatomy of a context-aware application. Wireless Networks - Selected Papers from Mobicom'99.
Kernel Eigenfaces vs. Kernel Fisherfaces: Face Recognition Using Kernel Methods. FGR '02 Proceedings of the Fifth IEEE International Conference on Automatic Face and Gesture Recognition.
Parallel Computing of Eigenvalue of Doubly Stochastic Matrix. ICA3PP '02 Proceedings of the Fifth International Conference on Algorithms and Architectures for Parallel Processing.
Linear Dimensionality Reduction via a Heteroscedastic Extension of LDA: The Chernoff Criterion. IEEE Transactions on Pattern Analysis and Machine Intelligence.
Discriminative Common Vectors for Face Recognition. IEEE Transactions on Pattern Analysis and Machine Intelligence.
A Two-Stage Linear Discriminant Analysis via QR-Decomposition. IEEE Transactions on Pattern Analysis and Machine Intelligence.
Local Discriminant Embedding and Its Variants. CVPR '05 Proceedings of the 2005 IEEE Computer Society Conference on Computer Vision and Pattern Recognition (CVPR'05) - Volume 2.
Subclass Discriminant Analysis. IEEE Transactions on Pattern Analysis and Machine Intelligence.
Regularized mixture discriminant analysis. Pattern Recognition Letters.
2D-LDA: A statistical linear discriminant analysis for image matrix. Pattern Recognition Letters.
CIARP'07 Proceedings of the Congress on pattern recognition 12th Iberoamerican conference on Progress in pattern recognition, image analysis and applications.
Subclass discriminant analysis (SDA) [Zhu, M., Martinez, A.M., 2006. Subclass discriminant analysis. IEEE Trans. Pattern Anal. Machine Intell., 28(8), pp. 1274-1286] is a dimensionality reduction method that has proven successful for various types of class distributions. Rather than assuming that each class is represented by a single cluster, SDA approximates the underlying distribution of each class with a mixture of Gaussians. The advantage of SDA is that, because it does not treat the class-conditional distributions as unimodal, nonlinearly separable problems can be handled by a linear projection. The drawback of this strategy, however, is that estimating the number of subclasses needed to represent each class's distribution, i.e., finding the best partition, requires all possible solutions to be verified, which incurs a high computational cost. In this paper, we propose a method that reduces the computational burden of SDA-based classification by limiting the number of classes to be examined: only a few classes of the training set are selected for partitioning prior to running SDA. The classes to be partitioned are chosen using the intra-set distance as a criterion, and k-means clustering is then performed to divide them. Our experimental results on an artificial data set of XOR-type samples and three benchmark image databases (Kimia, AT&T, and Yale) demonstrate that the processing CPU time of SDA optimized with the proposed scheme can be reduced dramatically without either sacrificing classification accuracy or increasing computational complexity.
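The pre-selection step described above can be sketched as follows. This is a minimal, illustrative Python sketch, not the paper's implementation: the intra-set distance is assumed here to be the mean pairwise Euclidean distance within a class, and the function names, the top-n selection rule, and the plain Lloyd's k-means are all simplifying assumptions.

```python
import math
import random

def intra_set_distance(points):
    # Mean pairwise Euclidean distance within one class -- a simple
    # spread measure (assumed form of the intra-set distance criterion).
    n = len(points)
    if n < 2:
        return 0.0
    total = sum(math.dist(points[i], points[j])
                for i in range(n) for j in range(i + 1, n))
    return total / (n * (n - 1) / 2)

def kmeans(points, k, iters=50, seed=0):
    # Plain Lloyd's k-means; returns one subclass label per point.
    rng = random.Random(seed)
    centers = rng.sample(points, k)
    labels = [0] * len(points)
    for _ in range(iters):
        labels = [min(range(k), key=lambda c: math.dist(p, centers[c]))
                  for p in points]
        for c in range(k):
            members = [p for p, lab in zip(points, labels) if lab == c]
            if members:
                centers[c] = tuple(sum(x) / len(members)
                                   for x in zip(*members))
    return labels

def split_spread_classes(data, n_select=1, k=2):
    # Partition only the n_select classes with the largest intra-set
    # distance into k subclasses; all other classes keep one cluster.
    ranked = sorted(data, key=lambda c: intra_set_distance(data[c]),
                    reverse=True)
    subclasses = {}
    for cls, pts in data.items():
        if cls in ranked[:n_select]:
            subclasses[cls] = kmeans(pts, k)
        else:
            subclasses[cls] = [0] * len(pts)
    return subclasses
```

On an XOR-type toy set, a class whose samples form two distant blobs has a large intra-set distance, so it is the one selected and split into two subclasses before SDA is applied; compact classes are left untouched, which is what saves the exhaustive search over partitions.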