Fisher discriminant analysis (FDA) is a popular and powerful method for dimensionality reduction and classification. Unfortunately, the optimality of the dimension reduction provided by FDA has only been proved in the homoscedastic case. In addition, FDA is known to perform poorly in the presence of label noise and when labeled data are sparse. To overcome these limitations, this work proposes a probabilistic framework for FDA that relaxes the homoscedastic assumption on the class covariance matrices and adds a term to explicitly model the non-discriminative information. This allows the proposed method to be robust to label noise and to be used in the semi-supervised setting. Experiments on real-world datasets show that the proposed approach performs at least as well as FDA in standard situations and outperforms it in the label-noise and sparse-label cases.
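The classical FDA baseline that the abstract builds on can be sketched in a few lines of NumPy: compute the within-class and between-class scatter matrices and project onto the leading eigenvectors of their ratio. This is a minimal illustration of standard FDA only, not the paper's probabilistic extension; the small regularization constant and the toy data are assumptions added for the sketch.

```python
import numpy as np

def fisher_discriminant(X, y, n_components=1):
    """Classical FDA: leading eigenvectors of Sw^{-1} Sb."""
    classes = np.unique(y)
    mu = X.mean(axis=0)
    d = X.shape[1]
    Sw = np.zeros((d, d))  # within-class scatter
    Sb = np.zeros((d, d))  # between-class scatter
    for c in classes:
        Xc = X[y == c]
        mc = Xc.mean(axis=0)
        Sw += (Xc - mc).T @ (Xc - mc)
        diff = (mc - mu).reshape(-1, 1)
        Sb += len(Xc) * diff @ diff.T
    # Small ridge term (an assumption) so Sw is invertible even for sparse data.
    evals, evecs = np.linalg.eig(np.linalg.solve(Sw + 1e-6 * np.eye(d), Sb))
    order = np.argsort(evals.real)[::-1]
    return evecs[:, order[:n_components]].real

# Toy data: two Gaussian classes separated along the first axis.
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(0.0, 1.0, (50, 2)),
               rng.normal([4.0, 0.0], 1.0, (50, 2))])
y = np.array([0] * 50 + [1] * 50)
W = fisher_discriminant(X, y)
z = X @ W  # 1-D projection in which the two classes are well separated
```

Note that the shared scatter matrix `Sw` is exactly the homoscedastic assumption the abstract criticizes: every class is forced to share one covariance structure, which the proposed probabilistic framework relaxes.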