Linear Discriminant Analysis (LDA) is an important dimensionality reduction algorithm, but its performance is often limited on multi-class data. This limitation arises because LDA maximizes the average divergence among classes, so similar classes with small divergence tend to be merged in the learned subspace. To address this problem, we propose a novel dimensionality reduction method called Maxi-Min Discriminant Analysis (MMDA). In contrast to traditional LDA, MMDA seeks a low-dimensional subspace that maximizes the minimal (worst-case) divergence among classes. This "maxi-min" criterion avoids the tendency of LDA to merge similar classes with small divergence on multi-class data. We formulate MMDA as a convex problem and, further, as a large-margin learning problem. A key contribution is an efficient online learning algorithm for solving the resulting problem, which makes the proposed method applicable to large-scale data. Experimental results on various datasets demonstrate the efficiency and efficacy of the proposed method against five competitive approaches, as well as its scalability to data with thousands of classes.
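The maxi-min criterion can be made concrete with a small sketch. The Python/NumPy code below is a simplified illustration of the idea, not the paper's algorithm (which uses a convex large-margin formulation solved by an online learner): it measures the divergence of a class pair by the projected distance between class means and runs projected subgradient ascent on the smallest pairwise divergence. All function names and parameters here are hypothetical.

import numpy as np

def pairwise_divergences(W, means):
    # Projected squared distances between all pairs of class means.
    divs, pairs = [], []
    for i in range(len(means)):
        for j in range(i + 1, len(means)):
            d = W.T @ (means[i] - means[j])
            divs.append(float(d @ d))
            pairs.append((i, j))
    return np.array(divs), pairs

def maximin_subspace(means, dim, steps=500, lr=0.1, seed=0):
    # Subgradient ascent on min_{i<j} ||W^T (mu_i - mu_j)||^2,
    # keeping W orthonormal by re-orthonormalizing after each step.
    rng = np.random.default_rng(seed)
    W, _ = np.linalg.qr(rng.standard_normal((means[0].shape[0], dim)))
    for _ in range(steps):
        divs, pairs = pairwise_divergences(W, means)
        i, j = pairs[int(np.argmin(divs))]      # worst-separated class pair
        diff = (means[i] - means[j])[:, None]
        grad = 2.0 * diff @ (diff.T @ W)        # subgradient of that pair's term
        W, _ = np.linalg.qr(W + lr * grad)      # ascent step, then projection
    return W

Given data X and labels y, the class means can be computed as means = [X[y == c].mean(axis=0) for c in np.unique(y)]; whitening X by the within-class covariance beforehand makes the projected-mean distance behave like a Fisher-style divergence. By contrast, classical LDA effectively maximizes the average of these pairwise terms, which lets a few well-separated pairs dominate the objective while similar classes stay merged.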