Feature extraction approaches based on matrix pattern: MatPCA and MatFLDA

  • Authors:
  • Songcan Chen;Yulian Zhu;Daoqiang Zhang;Jing-Yu Yang

  • Affiliations:
  • Songcan Chen, Yulian Zhu, Daoqiang Zhang: Department of Computer Science and Engineering, Nanjing University of Aeronautics and Astronautics, Nanjing, Jiangsu 210016, People's Republic of China
  • Jing-Yu Yang: Department of Computer Science, Nanjing University of Science and Technology, Nanjing 210094, People's Republic of China

  • Venue:
  • Pattern Recognition Letters
  • Year:
  • 2005

Quantified Score

Hi-index 0.10

Abstract

Principal component analysis (PCA) and Fisher linear discriminant analysis (FLDA), two popular feature extraction approaches in pattern recognition and data analysis, extract the needed features directly from vector patterns, i.e., before they are applied, any non-vector pattern such as an image is first vectorized into a vector pattern by some technique such as concatenation. However, such vectorization has been shown not to be beneficial for image recognition, as evidenced by both the algebraic feature extraction approach and 2DPCA. In this paper, inspired by these two approaches, we take the opposite direction and extract features from any vector pattern by first matrixizing it into a matrix pattern and then applying the matrixized versions of PCA and FLDA, MatPCA and MatFLDA, to that pattern. MatFLDA follows, in essence, the same principle as the algebraic feature extraction approach and is constructed from an objective function similar to FLDA's, while MatPCA, like PCA, obtains a set of projection vectors by minimizing the reconstruction error over the training samples, a derivation that differs somewhat from that of 2DPCA despite their equivalence. Finally, experiments on 10 publicly available datasets show that MatPCA and MatFLDA gain performance improvements to different degrees on 7 and 5 of the datasets, respectively, while the computational burden of extracting features is greatly reduced. In addition, it is noteworthy that the proposed approaches are still linear: the improvement in classification accuracy stems not from the commonly used non-linearization of the original linear approaches but simply from the matrixization. Furthermore, another prominent merit of matrixizing FLDA is that it naturally overcomes the notorious rank limitation, namely that the number of discriminating vectors that can be found is bounded by C-1 for a C-class problem, and does so without introducing additional computational cost.
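
The abstract states that MatPCA obtains its projection vectors by minimizing the reconstruction error of the training samples and notes its equivalence to 2DPCA. The snippet below is a minimal NumPy sketch under that reading: each vector pattern is matrixized by reshaping, and the projection vectors are taken as the leading eigenvectors of the scatter of the mean-centered matrix patterns. The function names (`matrixize`, `matpca_fit`, `matpca_transform`), the row-major reshape, and the choice to center before projecting are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def matrixize(x, shape):
    """Reshape a vector pattern into a matrix pattern (row-major reshape assumed)."""
    return x.reshape(shape)

def matpca_fit(X, shape, d):
    """Sketch of MatPCA, assuming (as with 2DPCA) that the projection vectors are
    the leading eigenvectors of the matrix-pattern scatter
    G = (1/N) * sum_i (A_i - mean)^T (A_i - mean).
    X: (N, m*n) vector patterns; shape: (m, n); d: number of projection vectors."""
    A = X.reshape(-1, *shape)             # matrixize every sample: (N, m, n)
    A_mean = A.mean(axis=0)               # mean matrix pattern
    centered = A - A_mean
    # accumulate the n x n scatter of the centered matrix patterns
    G = np.einsum('imj,imk->jk', centered, centered) / A.shape[0]
    eigvals, eigvecs = np.linalg.eigh(G)  # eigenvalues in ascending order
    W = eigvecs[:, ::-1][:, :d]           # top-d projection vectors, shape (n, d)
    return A_mean, W

def matpca_transform(X, shape, A_mean, W):
    """Project each matrixized (centered) sample; features are (m, d) matrices."""
    A = X.reshape(-1, *shape) - A_mean
    return A @ W                          # shape (N, m, d)

# Toy usage: 150 vector patterns of dimension 64, matrixized to 8 x 8
rng = np.random.default_rng(0)
X = rng.standard_normal((150, 64))
A_mean, W = matpca_fit(X, shape=(8, 8), d=3)
F = matpca_transform(X, (8, 8), A_mean, W)
print(F.shape)  # (150, 8, 3)
```

Note that the eigenproblem here is only n x n (the matrix-pattern width) rather than (m*n) x (m*n) as in vector-based PCA, which illustrates the reduced computational burden of feature extraction mentioned in the abstract.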