Linear discriminant analysis (LDA) is a linear classifier that has proven powerful and competitive with the main state-of-the-art classifiers. However, the LDA algorithm assumes that the sample vectors of each class are generated from underlying multivariate normal distributions with a common covariance matrix but different means (i.e., homoscedastic data). This assumption has considerably restricted the use of LDA. Over the years, several extensions of the basic LDA formulation have been proposed. One such method is heteroscedastic LDA (HLDA), which addresses the heteroscedasticity problem. Another is nonparametric discriminant analysis (NDA), which relaxes the normality assumption. In this paper, we propose a novel Bayesian logistic discriminant (BLD) model that addresses both the normality and the heteroscedasticity problems. The normality assumption is relaxed by approximating the underlying distribution of each class with a mixture of Gaussians. Hence, the proposed BLD offers more flexibility and better classification performance than LDA, HLDA, and NDA. Subclass and multinomial versions of the BLD are also proposed. The posterior distribution of the BLD model is elegantly approximated by a tractable Gaussian form using variational transformation and Jensen's inequality, allowing a straightforward computation of the weights. An extensive comparison of the BLD with LDA, the support vector machine (SVM), HLDA, NDA, and subclass discriminant analysis (SDA), performed on artificial and real data sets, shows the advantages and superiority of the proposed method. In particular, the face recognition experiments clearly show a significant improvement of the proposed BLD over LDA.
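To make the homoscedasticity assumption concrete, the following minimal NumPy sketch (not part of the paper, and not the proposed BLD) fits Fisher's LDA on synthetic two-class Gaussian data. The pooled scatter matrix `Sw` is exactly where the common-covariance assumption enters: both classes contribute to a single covariance estimate, which is what HLDA and the proposed BLD relax.

```python
import numpy as np

rng = np.random.default_rng(0)

# Two homoscedastic Gaussian classes (shared covariance, different means):
# the setting in which LDA is well suited.
n = 500
cov = np.array([[1.0, 0.3], [0.3, 1.0]])
X0 = rng.multivariate_normal([0.0, 0.0], cov, n)
X1 = rng.multivariate_normal([3.0, 3.0], cov, n)

def fit_lda(X0, X1):
    """Fisher's LDA. The pooled within-class scatter Sw encodes the
    homoscedasticity assumption: one covariance for both classes."""
    mu0, mu1 = X0.mean(axis=0), X1.mean(axis=0)
    Sw = np.cov(X0, rowvar=False) + np.cov(X1, rowvar=False)  # pooled scatter
    w = np.linalg.solve(Sw, mu1 - mu0)   # discriminant direction Sw^{-1}(mu1 - mu0)
    b = -0.5 * w @ (mu0 + mu1)           # threshold at the projected midpoint
    return w, b

def predict(w, b, X):
    # Class 1 iff the projection falls on class 1's side of the midpoint.
    return (X @ w + b > 0).astype(int)

w, b = fit_lda(X0, X1)
X = np.vstack([X0, X1])
y = np.concatenate([np.zeros(n), np.ones(n)])
acc = (predict(w, b, X) == y).mean()
print(f"LDA accuracy on homoscedastic data: {acc:.2f}")
```

On heteroscedastic data (distinct per-class covariances), the single pooled `Sw` is misspecified and the linear boundary can degrade badly, which is the failure mode motivating HLDA, NDA, and the BLD above.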