Recursive concave-convex Fisher linear discriminant (RPFLD) is a novel, efficient dimension-reduction method that has been successfully applied to image recognition. However, RPFLD suffers from the singularity problem and may lose useful discriminant information when applied to high-dimensional data. Moreover, RPFLD is computationally expensive because it must solve a series of quadratic programming (QP) problems to obtain the optimal solution. To improve the generalization performance of RPFLD and at the same time reduce its training burden, we propose in this paper a novel method termed regularized least squares Fisher linear discriminant (RLS-FLD). The central idea is to introduce regularization into RPFLD while simultaneously using the 2-norm loss function. As a result, the objective function of RLS-FLD becomes positive definite, which avoids the singularity problem. To solve RLS-FLD, the concave-convex programming (CCP) algorithm is employed to convert the original nonconvex problem into a series of equality-constrained convex QP problems, each of which admits a closed-form solution in its primal formulation via the classic Lagrangian method. The resulting RLS-FLD thus trains much faster and does not require any optimization packages. In addition, a theoretical analysis uncovers the connections between RLS-FLD and regularized linear discriminant analysis (RLDA), giving further insight into the principle of RLS-FLD. The effectiveness of the proposed RLS-FLD is demonstrated by experimental results on real-world handwritten digit, face, and object recognition datasets.
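To make the regularization idea concrete, the RLDA baseline that the theoretical analysis connects to can be sketched as below. This is a minimal illustration of standard regularized LDA, not the paper's RLS-FLD algorithm; the function name `rlda` and the default ridge parameter `lam` are illustrative choices. The key point is that adding a ridge term turns the within-class scatter into a positive-definite matrix, which is exactly how regularization sidesteps the singularity problem discussed above:

```python
import numpy as np

def rlda(X, y, lam=1e-3, n_components=1):
    """Sketch of regularized LDA: solve S_b w = mu (S_w + lam*I) w.

    The ridge term lam*I makes the within-class scatter S_w positive
    definite (hence invertible), so the problem is well posed even when
    the data dimension exceeds the number of samples.
    """
    classes = np.unique(y)
    d = X.shape[1]
    mean = X.mean(axis=0)
    S_w = np.zeros((d, d))  # within-class scatter
    S_b = np.zeros((d, d))  # between-class scatter
    for c in classes:
        Xc = X[y == c]
        mc = Xc.mean(axis=0)
        S_w += (Xc - mc).T @ (Xc - mc)
        diff = (mc - mean).reshape(-1, 1)
        S_b += len(Xc) * (diff @ diff.T)
    # Regularized generalized eigenproblem; keep leading eigenvectors.
    evals, evecs = np.linalg.eig(np.linalg.solve(S_w + lam * np.eye(d), S_b))
    order = np.argsort(-evals.real)
    return evecs.real[:, order[:n_components]]
```

Projecting data onto the returned directions (`X @ W`) yields the reduced representation; in RLDA, as in RLS-FLD, the regularization parameter trades off numerical stability against fidelity to the empirical scatter.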