Fisher linear discriminant analysis (FDA) and its kernel extension, kernel discriminant analysis (KDA), are well-known methods that treat dimensionality reduction and classification jointly. Although widely deployed in practical problems, unresolved issues remain concerning their efficient implementation and their relationship with least mean squares procedures. In this paper we address these issues within the framework of regularized estimation. Our approach leads to a flexible and efficient implementation of both FDA and KDA. We also uncover a general relationship between regularized discriminant analysis and ridge regression. This relationship yields variations on conventional FDA based on the pseudoinverse, as well as a direct equivalence to an ordinary least squares estimator.
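The ridge-regression relationship mentioned above can be illustrated in the binary case. A classical result (see, e.g., Duda and Hart) is that least squares regression with class-coded targets n/n1 and -n/n2 recovers the FDA direction; adding a ridge penalty λ recovers the regularized FDA direction (S_w + λI)^{-1}(m1 - m2), because the total scatter differs from the within-class scatter by a rank-one term in (m1 - m2). The following sketch checks this numerically on synthetic Gaussian data (all variable names and the data-generation setup are illustrative, not taken from the paper):

```python
import numpy as np

rng = np.random.default_rng(0)
n1, n2, d = 60, 40, 5
X1 = rng.normal(loc=1.0, scale=1.0, size=(n1, d))   # class 1 samples
X2 = rng.normal(loc=-1.0, scale=1.0, size=(n2, d))  # class 2 samples
X = np.vstack([X1, X2])
n = n1 + n2
lam = 0.1  # ridge / regularization parameter

# Regularized FDA direction: (S_w + lam*I)^{-1} (m1 - m2)
m1, m2 = X1.mean(axis=0), X2.mean(axis=0)
Sw = (X1 - m1).T @ (X1 - m1) + (X2 - m2).T @ (X2 - m2)
w_fda = np.linalg.solve(Sw + lam * np.eye(d), m1 - m2)

# Ridge regression with targets n/n1 (class 1) and -n/n2 (class 2).
# These targets sum to zero, so centering X alone suffices.
t = np.concatenate([np.full(n1, n / n1), np.full(n2, -n / n2)])
Xc = X - X.mean(axis=0)
w_ridge = np.linalg.solve(Xc.T @ Xc + lam * np.eye(d), Xc.T @ t)

# The two directions are parallel (Sherman-Morrison on the rank-one
# between-class term), so their cosine similarity is 1 up to rounding.
cos = w_fda @ w_ridge / (np.linalg.norm(w_fda) * np.linalg.norm(w_ridge))
print(round(cos, 6))
```

This is only the two-class, linear special case; the paper's contribution is the general multi-class and kernelized treatment of this equivalence.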