Kernel discriminant analysis (KDA) is an effective approach to supervised nonlinear dimensionality reduction. Probabilistic models can be combined with KDA to improve its robustness. However, existing models of this kind handle only binary-class problems, which limits their applicability to many real-world tasks. To overcome this limitation, we propose a novel nonparametric probabilistic model, based on Gaussian processes, for KDA on multiclass problems. The model provides a new Bayesian interpretation of KDA that allows its parameters to be tuned automatically by maximizing the marginal log-likelihood of the data. An empirical study demonstrates the efficacy of the proposed model.
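The key mechanism the abstract describes, selecting kernel parameters by maximizing the marginal log-likelihood (the "evidence") under a Gaussian process prior, can be illustrated with a generic sketch. This is not the paper's multiclass KDA model; it is a minimal regression-style example, assuming an RBF kernel, ±1-encoded labels, and a simple grid search over the kernel width:

```python
import numpy as np

def rbf_kernel(X, sigma):
    # Gram matrix of the RBF kernel k(x, x') = exp(-||x - x'||^2 / (2 sigma^2)).
    sq = np.sum(X ** 2, axis=1)
    d2 = sq[:, None] + sq[None, :] - 2.0 * X @ X.T
    return np.exp(-d2 / (2.0 * sigma ** 2))

def log_marginal_likelihood(X, y, sigma, noise=1e-2):
    # log p(y | X, sigma) under a zero-mean GP prior with RBF kernel:
    #   -1/2 y^T K^{-1} y - 1/2 log|K| - n/2 log(2 pi),
    # computed stably via a Cholesky factorization of K = K_rbf + noise*I.
    n = len(y)
    K = rbf_kernel(X, sigma) + noise * np.eye(n)
    L = np.linalg.cholesky(K)
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, y))
    return (-0.5 * y @ alpha
            - np.sum(np.log(np.diag(L)))
            - 0.5 * n * np.log(2.0 * np.pi))

# Toy two-class data; class labels encoded as +/-1 regression targets.
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(-1.0, 0.3, (20, 2)),
               rng.normal(+1.0, 0.3, (20, 2))])
y = np.hstack([-np.ones(20), np.ones(20)])

# Evidence-based model selection: keep the kernel width that
# maximizes the marginal log-likelihood of the training data.
widths = [0.1, 0.3, 1.0, 3.0, 10.0]
best_sigma = max(widths, key=lambda s: log_marginal_likelihood(X, y, s))
```

The same principle, gradient-based rather than grid-based in practice, is what lets the proposed model tune KDA's kernel parameters automatically instead of by cross-validation.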