Application of Linear Regression Classification to low-dimensional datasets

  • Authors:
  • Mehmet Koç; Atalay Barkana

  • Venue:
  • Neurocomputing
  • Year:
  • 2014

Abstract

The traditional Linear Regression Classification (LRC) method fails when the number of samples in the training set exceeds their dimensionality. In this work, we propose a new implementation of LRC to overcome this problem in pattern recognition. The new form of LRC works even when an excessive number of low-dimensional samples is available. To explain the new form of LRC, the relation between the predictor and the correlation matrix of a class is shown first. Then, for the derivation of LRC, the null space of the correlation matrix is generated using the eigenvectors corresponding to the smallest eigenvalues. These eigenvectors are used to calculate the projection matrix in LRC. The equivalence of LRC and the method called Class-Featuring Information Compression (CLAFIC) is also shown theoretically. The TI Digit database and the Multiple Feature dataset are used to illustrate the proposed improvement on LRC and CLAFIC.
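The sketch below is not the authors' code; it is a minimal NumPy illustration of the idea summarized in the abstract. The classic LRC residual is the distance from a test vector to the span of a class's training samples, which becomes uninformative once a class has more samples than dimensions. The variant suggested by the abstract instead measures the energy of the test vector along the eigenvectors of the class correlation matrix with the smallest eigenvalues (an approximate null space). The toy data, the number of discarded directions `k`, and the function names are illustrative assumptions.

```python
import numpy as np

def lrc_residual(y, X):
    """Classic LRC residual: distance from y to the column space of X
    (columns of X are training samples of one class). When X has full
    row rank, the projection reproduces every y and the residual is ~0."""
    H = X @ np.linalg.pinv(X)              # hat (projection) matrix onto span(X)
    return np.linalg.norm(y - H @ y)

def null_space_residual(y, X, k):
    """Eigenvector-based variant sketched from the abstract: project y onto
    the k eigenvectors of the class correlation matrix with the smallest
    eigenvalues, i.e. the directions the class barely occupies."""
    C = X @ X.T / X.shape[1]               # class correlation matrix (d x d)
    eigvals, eigvecs = np.linalg.eigh(C)   # eigenvalues in ascending order
    V = eigvecs[:, :k]                     # approximate null-space basis
    return np.linalg.norm(V.T @ y)         # energy of y outside the class subspace

# Made-up example: 4-dimensional samples, two classes, each with more
# samples (10) than dimensions (4), the regime where classic LRC breaks down.
rng = np.random.default_rng(0)
classes = {c: rng.normal(size=(4, 10)) + 3 * c for c in (0, 1)}
y = classes[1][:, 0] + 0.1 * rng.normal(size=4)   # a noisy class-1 sample

# Classic residuals are all near zero, so classes cannot be separated;
# the null-space residual is expected to be smaller for the true class.
print({c: round(float(lrc_residual(y, X)), 4) for c, X in classes.items()})
print({c: round(float(null_space_residual(y, X, k=2)), 4) for c, X in classes.items()})
```

Classification under this scheme would assign the test vector to the class with the smallest residual; the CLAFIC connection mentioned in the abstract corresponds to using the complementary subspace spanned by the largest-eigenvalue eigenvectors.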