Modified Quadratic Discriminant Functions and the Application to Chinese Character Recognition. IEEE Transactions on Pattern Analysis and Machine Intelligence.
Introduction to statistical pattern recognition (2nd ed.).
Nonlinear component analysis as a kernel eigenvalue problem. Neural Computation.
Independent component analysis: algorithms and applications. Neural Networks.
Fractional-Step Dimensionality Reduction. IEEE Transactions on Pattern Analysis and Machine Intelligence.
Multiclass Linear Dimension Reduction by Weighted Pairwise Fisher Criteria. IEEE Transactions on Pattern Analysis and Machine Intelligence.
Random projection in dimensionality reduction: applications to image and text data. Proceedings of the seventh ACM SIGKDD international conference on Knowledge discovery and data mining.
Feature Extraction Based on Decision Boundaries. IEEE Transactions on Pattern Analysis and Machine Intelligence.
Linear Dimensionality Reduction via a Heteroscedastic Extension of LDA: The Chernoff Criterion. IEEE Transactions on Pattern Analysis and Machine Intelligence.
Discriminative Common Vectors for Face Recognition. IEEE Transactions on Pattern Analysis and Machine Intelligence.
Subclass Discriminant Analysis. IEEE Transactions on Pattern Analysis and Machine Intelligence.
ICPR '06 Proceedings of the 18th International Conference on Pattern Recognition - Volume 01.
On Feature Extraction for Limited Class Problem. ICPR '96 Proceedings of the 13th International Conference on Pattern Recognition - Volume 2.
Statistical Comparisons of Classifiers over Multiple Data Sets. The Journal of Machine Learning Research.
Normalization-Cooperated Gradient Feature Extraction for Handwritten Character Recognition. IEEE Transactions on Pattern Analysis and Machine Intelligence.
Journal of Cognitive Neuroscience.
Geometric Mean for Subspace Selection. IEEE Transactions on Pattern Analysis and Machine Intelligence.
Distance metric learning by minimal distance maximization. Pattern Recognition.
Dimensionality Reduction by Minimal Distance Maximization. ICPR '10 Proceedings of the 2010 20th International Conference on Pattern Recognition.
Max-Min Distance Analysis by Using Sequential SDP Relaxation for Dimension Reduction. IEEE Transactions on Pattern Analysis and Machine Intelligence.
Nonparametric Discriminant Analysis. IEEE Transactions on Pattern Analysis and Machine Intelligence.
ICDAR 2011 Chinese Handwriting Recognition Competition. ICDAR '11 Proceedings of the 2011 International Conference on Document Analysis and Recognition.
CASIA Online and Offline Chinese Handwriting Databases. ICDAR '11 Proceedings of the 2011 International Conference on Document Analysis and Recognition.
Experiments with random projection. UAI'00 Proceedings of the Sixteenth conference on Uncertainty in artificial intelligence.
Discriminative learning quadratic discriminant function for handwriting recognition. IEEE Transactions on Neural Networks.
Deep Learning Regularized Fisher Mappings. IEEE Transactions on Neural Networks.
Multi-column deep neural networks for image classification. CVPR '12 Proceedings of the 2012 IEEE Conference on Computer Vision and Pattern Recognition (CVPR).
Confused Distance Maximization for Large Category Dimensionality Reduction. ICFHR '12 Proceedings of the 2012 International Conference on Frontiers in Handwriting Recognition.
To improve the class separability of Fisher linear discriminant analysis (FDA) for large category problems, we investigate the weighted Fisher criterion (WFC), which integrates weighting functions into dimensionality reduction. The objective of WFC is to maximize the sum of weighted distances over all class pairs. By assigning larger weights to the most confusable class pairs, WFC improves class separation while the solution remains an eigen-decomposition problem. We evaluate five weighting functions in three weighting spaces on a typical large category problem, handwritten Chinese character recognition. Four of the weighting functions are based on existing methods, namely FDA, the approximate pairwise accuracy criterion (aPAC), the power function (POW), and confused distance maximization (CDM); the fifth is a new one based on K-nearest neighbors (KNN). All the weighting functions can be computed in the original feature space, the low-dimensional space, or the fractional space. Our experiments on a 3,755-class Chinese handwriting database demonstrate that WFC improves classification accuracy significantly over FDA. Among the weighting functions, the KNN method in the original space is the most competitive, achieving significantly higher classification accuracy at low computational complexity. To further improve performance, we propose a nonparametric extension of the KNN method from the class level to the sample level. The sample-level KNN (SKNN) method is shown to significantly outperform other methods for Chinese handwriting recognition, such as locally linear discriminant analysis (LLDA), neighbor class linear discriminant analysis (NCLDA), and heteroscedastic linear discriminant analysis (HLDA).
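The core idea of the weighted Fisher criterion can be sketched in a few lines: build a between-class scatter matrix in which each class pair is weighted by a function of its separability, then solve the same generalized eigenproblem as FDA. The sketch below is a minimal illustration under stated assumptions, not the paper's implementation; the function name `weighted_fisher_projection` and the particular regularization and prior weighting are choices made here for clarity, and the specific weighting functions studied in the paper (aPAC, POW, CDM, KNN) are abstracted into a user-supplied `weight_fn`.

```python
import numpy as np
from scipy.linalg import eigh

def weighted_fisher_projection(X, y, d, weight_fn=None):
    """Sketch of a weighted Fisher criterion (WFC) projection.

    Builds a weighted between-class scatter
        Sb_w = sum_{i<j} p_i p_j w_ij (m_i - m_j)(m_i - m_j)^T
    and the within-class scatter Sw, then solves the generalized
    eigenproblem Sb_w v = lambda Sw v, keeping the top-d eigenvectors.
    `weight_fn` maps a pairwise class-mean distance to a weight; the
    constant weight 1 recovers plain FDA (up to scaling). Decreasing
    weight functions emphasize the most confusable (closest) pairs.
    """
    classes = np.unique(y)
    n, dim = X.shape
    means = {c: X[y == c].mean(axis=0) for c in classes}
    priors = {c: (y == c).mean() for c in classes}

    # Within-class scatter.
    Sw = np.zeros((dim, dim))
    for c in classes:
        Xc = X[y == c] - means[c]
        Sw += Xc.T @ Xc / n

    # Weighted between-class scatter over all class pairs.
    Sb = np.zeros((dim, dim))
    for i, ci in enumerate(classes):
        for cj in classes[i + 1:]:
            diff = means[ci] - means[cj]
            w = 1.0 if weight_fn is None else weight_fn(np.linalg.norm(diff))
            Sb += priors[ci] * priors[cj] * w * np.outer(diff, diff)

    # Small ridge so Sw is positive definite for the generalized solver.
    Sw += 1e-6 * np.trace(Sw) / dim * np.eye(dim)
    # eigh returns eigenvalues in ascending order; take the largest d.
    _, vecs = eigh(Sb, Sw)
    return vecs[:, ::-1][:, :d]
```

A weight such as `lambda dist: 1.0 / dist**2`, which grows as class means get closer, is one simple way to emphasize confusable pairs in the spirit of the paper's weighting functions; the exact forms of aPAC, POW, CDM, and the KNN weighting differ and should be taken from the paper itself.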