Regularized Discriminant Analysis, Ridge Regression and Beyond

  • Authors:
  • Zhihua Zhang, Guang Dai, Congfu Xu, Michael I. Jordan


  • Venue:
  • The Journal of Machine Learning Research
  • Year:
  • 2010

Abstract

Fisher linear discriminant analysis (FDA) and its kernel extension, kernel discriminant analysis (KDA), are well-known methods that consider dimensionality reduction and classification jointly. Although widely deployed in practical problems, these methods still present unresolved issues surrounding their efficient implementation and their relationship with least mean squares procedures. In this paper we address these issues within the framework of regularized estimation. Our approach leads to a flexible and efficient implementation of FDA as well as KDA. We also uncover a general relationship between regularized discriminant analysis and ridge regression. This relationship yields variations on conventional FDA based on the pseudoinverse and a direct equivalence to an ordinary least squares estimator.
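The classical two-class instance of the FDA/least-squares connection mentioned in the abstract can be checked numerically: the Fisher discriminant direction S_w^{-1}(μ₁ − μ₀) coincides, up to a scalar factor, with the coefficient vector of an ordinary least squares fit to centered class-indicator targets. The sketch below uses synthetic Gaussian data purely for illustration; it is not the paper's algorithm, only the textbook binary-case equivalence.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic two-class data in 5 dimensions (illustrative only)
n0, n1, d = 60, 40, 5
X0 = rng.normal(0.0, 1.0, (n0, d))
X1 = rng.normal(1.0, 1.0, (n1, d))
X = np.vstack([X0, X1])
n = n0 + n1

# Fisher direction: w_fda ∝ S_w^{-1} (mu1 - mu0), S_w the within-class scatter
mu0, mu1 = X0.mean(axis=0), X1.mean(axis=0)
Sw = (X0 - mu0).T @ (X0 - mu0) + (X1 - mu1).T @ (X1 - mu1)
w_fda = np.linalg.solve(Sw, mu1 - mu0)

# OLS fit to class-indicator targets on centered inputs; the particular
# two-value coding only affects the scale of the solution, not its direction
y = np.concatenate([np.full(n0, -n / n0), np.full(n1, n / n1)])
Xc = X - X.mean(axis=0)
w_ols = np.linalg.lstsq(Xc, y, rcond=None)[0]

# The two directions agree up to a scalar: |cosine similarity| ≈ 1
cos = w_fda @ w_ols / (np.linalg.norm(w_fda) * np.linalg.norm(w_ols))
print(abs(cos))
```

The agreement follows from the normal equations: the centered Gram matrix Xcᵀ Xc is the total scatter S_t = S_w + (n₀n₁/n)(μ₁ − μ₀)(μ₁ − μ₀)ᵀ, and by the Sherman–Morrison identity S_t^{-1}(μ₁ − μ₀) is proportional to S_w^{-1}(μ₁ − μ₀).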