In this paper, we develop a novel dimensionality reduction (DR) framework, coined complete large margin linear discriminant analysis (CLMLDA). Inspired by several recently proposed DR methods, CLMLDA constructs two mathematical programming models by maximizing the minimum distance between each class center and the total class center, respectively, in the null space of the within-class scatter matrix and in its orthogonal complement. In this way, CLMLDA not only makes full use of the discriminative information contained in the whole feature space but also overcomes the weakness of linear discriminant analysis (LDA) in dealing with the class separation problem. The solutions of CLMLDA follow from solving two nonconvex optimization problems, each of which is first transformed into a series of convex quadratic programming problems via the constrained concave-convex procedure (CCCP) and then solved with an off-the-shelf optimization toolbox. Experiments on both toy data and several publicly available databases demonstrate the feasibility and effectiveness of CLMLDA.
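The subspace decomposition that CLMLDA builds on, splitting the feature space into the null space of the within-class scatter matrix S_w and its orthogonal complement, can be sketched as below. This is a minimal illustration only: the function name and the numerical tolerance are my own choices, and the subsequent max-min objective and its CCCP relaxation are not shown.

```python
import numpy as np

def scatter_subspaces(X, y, tol=1e-10):
    """Split feature space by the within-class scatter matrix S_w.

    X : (n_samples, d) data matrix; y : (n_samples,) class labels.
    Returns orthonormal bases for the null space of S_w (where
    within-class variation vanishes) and for its orthogonal
    complement (the range of S_w).
    """
    d = X.shape[1]
    Sw = np.zeros((d, d))
    for c in np.unique(y):
        Xc = X[y == c]
        diff = Xc - Xc.mean(axis=0)   # center each class
        Sw += diff.T @ diff           # accumulate within-class scatter
    # Eigendecomposition of the symmetric PSD matrix S_w; eigenvalues
    # (numerically) equal to zero span the null space.
    evals, evecs = np.linalg.eigh(Sw)
    thresh = tol * max(evals.max(), 1.0)
    null_basis = evecs[:, evals <= thresh]
    range_basis = evecs[:, evals > thresh]
    return null_basis, range_basis
```

Projecting the class centers onto each basis gives the two coordinate systems in which CLMLDA's respective max-min programs are then posed.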