Comparison of Visible, Thermal Infra-Red and Range Images for Face Recognition
PSIVT '09 Proceedings of the 3rd Pacific Rim Symposium on Advances in Image and Video Technology
Multi-task regularization of generative similarity models
SIMBAD'11 Proceedings of the First International Conference on Similarity-Based Pattern Recognition
K-means and adaptive k-means algorithms for clustering DNS traffic
Proceedings of the 5th International ICST Conference on Performance Evaluation Methodologies and Tools
International Journal of Artificial Intelligence and Soft Computing
Covariance Matrix Estimation with Multi-Regularization Parameters based on MDL Principle
Neural Processing Letters
Quadratic discriminant analysis is a common tool for classification, but estimation of the Gaussian parameters can be ill-posed. This paper contains theoretical and algorithmic contributions to Bayesian estimation for quadratic discriminant analysis. A distribution-based Bayesian classifier is derived using information geometry. Using a calculus of variations approach to define a functional Bregman divergence for distributions, it is shown that the Bayesian distribution-based classifier that minimizes the expected Bregman divergence of each class conditional distribution also minimizes the expected misclassification cost. A series approximation is used to relate regularized discriminant analysis to Bayesian discriminant analysis. A new Bayesian quadratic discriminant analysis classifier is proposed where the prior is defined using a coarse estimate of the covariance based on the training data; this classifier is termed BDA7. Results on benchmark data sets and simulations show that BDA7 performance is competitive with, and in some cases significantly better than, regularized quadratic discriminant analysis and the cross-validated Bayesian quadratic discriminant analysis classifier Quadratic Bayes.
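To make the setting concrete, the following is a minimal NumPy sketch of regularized quadratic discriminant analysis in the spirit the abstract describes: each class covariance is shrunk toward a coarse estimate computed from the training data before plugging into the Gaussian discriminant. This is an illustration, not the BDA7 algorithm itself; the shrinkage weight `lam` and the choice of a pooled diagonal as the coarse estimate are assumptions for the example.

```python
import numpy as np

def fit_qda_regularized(X, y, lam=0.2):
    """Fit per-class Gaussians, shrinking each class covariance toward a
    coarse pooled-diagonal estimate (a stand-in for the data-dependent
    prior discussed in the abstract; `lam` is a hypothetical weight)."""
    classes = np.unique(y)
    # Coarse estimate: diagonal of the pooled covariance over all data.
    coarse = np.diag(np.diag(np.cov(X, rowvar=False)))
    params = {}
    for c in classes:
        Xc = X[y == c]
        mu = Xc.mean(axis=0)
        S = np.cov(Xc, rowvar=False)
        # Regularize: convex combination of sample and coarse covariance,
        # which keeps the estimate well-conditioned when Xc is small.
        Sigma = (1 - lam) * S + lam * coarse
        params[c] = (mu, Sigma, len(Xc) / len(X))
    return params

def predict_qda(params, x):
    """Assign x to the class maximizing the Gaussian log-posterior."""
    best, best_score = None, -np.inf
    for c, (mu, Sigma, prior) in params.items():
        d = x - mu
        _, logdet = np.linalg.slogdet(Sigma)
        score = (-0.5 * logdet
                 - 0.5 * d @ np.linalg.solve(Sigma, d)
                 + np.log(prior))
        if score > best_score:
            best, best_score = c, score
    return best
```

With `lam = 0` this reduces to plain QDA; with `lam = 1` every class shares the coarse diagonal estimate, so `lam` trades variance of the per-class estimates against bias toward the shared structure, much as the cross-validated regularization parameters in regularized discriminant analysis do.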