This paper introduces an estimation technique for covariance matrices. The method differs from previous estimators in three respects: it specifies an application-dependent cost function; it regularizes all classes in the same way and then compensates for volume distortions via scale parameters; and it allows m-fold rather than leave-one-out cross-validation. The technique provides a systematic basis for parameter estimation in high-dimensional spaces, where there are inevitably far too few training samples for reliable parameter estimates from sample statistics alone. This is demonstrated with standard classifiers using normal models in the high-dimensional space of appearance-based image processing. When the models are trained with the new technique, face classification performance is significantly better than with unregularized covariances or with earlier regularized estimators. Dimensionality reduction is also improved when it uses a covariance structure estimated with the method.
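The shared-regularizer-plus-rescaling idea and the m-fold model selection described above can be sketched as follows. This is a hypothetical illustration under simplifying assumptions, not the paper's exact estimator: shrinkage toward the pooled covariance stands in for the regularizer applied identically to all classes, a trace-matching factor stands in for the volume-compensating scale parameter, and held-out Gaussian log-likelihood stands in for the application-dependent cost function. All function names (`shrunk_covariance`, `choose_alpha`) are invented for this sketch.

```python
import numpy as np

def shrunk_covariance(X, pooled, alpha):
    # Blend the class sample covariance with the pooled covariance using the
    # SAME mixing weight alpha for every class, then rescale so the trace
    # matches the sample covariance -- a simple stand-in for the paper's
    # volume-compensating scale parameter (assumption, not the exact scheme).
    S = np.cov(X, rowvar=False)
    C = (1.0 - alpha) * S + alpha * pooled
    scale = np.trace(S) / np.trace(C) if np.trace(C) > 0 else 1.0
    return scale * C

def gaussian_loglik(X, mean, cov):
    # Average log-density of the rows of X under N(mean, cov).
    d = X.shape[1]
    _, logdet = np.linalg.slogdet(cov)
    diff = X - mean
    mahal = np.einsum('ij,jk,ik->i', diff, np.linalg.inv(cov), diff)
    return np.mean(-0.5 * (d * np.log(2.0 * np.pi) + logdet + mahal))

def choose_alpha(X, pooled, alphas, m=5, rng=None):
    # m-fold cross-validation (rather than leave-one-out): pick the
    # regularization weight whose held-out log-likelihood is highest.
    rng = np.random.default_rng(rng)
    folds = np.array_split(rng.permutation(len(X)), m)
    best_alpha, best_score = None, -np.inf
    for alpha in alphas:
        score = 0.0
        for k in range(m):
            test = folds[k]
            train = np.concatenate([folds[j] for j in range(m) if j != k])
            cov = shrunk_covariance(X[train], pooled, alpha)
            score += gaussian_loglik(X[test], X[train].mean(axis=0), cov)
        if score > best_score:
            best_alpha, best_score = alpha, score
    return best_alpha
```

With few samples per class in a high-dimensional space, the per-class sample covariance is singular; shrinking toward the pooled estimate keeps each class model invertible, while the rescaling step counteracts the volume change that the shared regularizer would otherwise impose on every class.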