Application of Linear Regression Classification to low-dimensional datasets


NEUROCOMPUTING, vol.131, pp.331-335, 2014 (SCI-Expanded)

  • Publication Type: Article
  • Volume: 131
  • Publication Date: 2014
  • Doi Number: 10.1016/j.neucom.2013.10.009
  • Journal Name: NEUROCOMPUTING
  • Journal Indexes: Science Citation Index Expanded (SCI-EXPANDED), Scopus
  • Page Numbers: pp.331-335
  • Keywords: Correlation matrix, Subspace methods, Linear Regression Classification, Class-featuring information compression, FACE RECOGNITION
  • Anadolu University Affiliated: Yes


The traditional Linear Regression Classification (LRC) method fails when the number of samples in the training set exceeds their dimension. In this work, we propose a new implementation of LRC to overcome this problem in pattern recognition. The new form of LRC works even with an excessive number of low-dimensional samples. To explain the new form of LRC, the relation between the predictor and the correlation matrix of a class is shown first. Then, for the derivation of LRC, the null space of the correlation matrix is generated using the eigenvectors corresponding to the smallest eigenvalues. These eigenvectors are used to calculate the projection matrix in LRC. The equivalence of LRC and the method called Class-Featuring Information Compression (CLAFIC) is also shown theoretically. The TI Digit database and the Multiple Features dataset are used to illustrate the proposed improvement on LRC and CLAFIC. (C) 2013 Elsevier B.V. All rights reserved.
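The core idea in the abstract can be sketched in a few lines of NumPy: instead of the classical LRC hat matrix X (X^T X)^{-1} X^T, which requires X to have full column rank (and thus fails when the sample count exceeds the dimension), the class projector is built from the eigenvectors of the class correlation matrix R = X X^T, i.e. the orthogonal complement of its null space. This is a minimal illustrative sketch, not the paper's exact algorithm; the function names and the rank tolerance are assumptions.

```python
import numpy as np

def class_projector(X, tol=1e-10):
    """Orthogonal projector onto the column space of X (d x n), built
    from the eigenvectors of the class correlation matrix R = X X^T.
    Unlike the classical LRC hat matrix X (X^T X)^{-1} X^T, this does
    not break down when n > d (where X^T X is singular)."""
    R = X @ X.T                            # d x d correlation matrix
    vals, vecs = np.linalg.eigh(R)         # eigenvalues in ascending order
    U = vecs[:, vals > tol * vals.max()]   # eigenvectors outside the null space
    return U @ U.T                         # projector onto the class subspace

def lrc_predict(y, projectors):
    """Assign y to the class whose subspace reconstructs it with the
    smallest residual norm ||y - P y||."""
    residuals = [np.linalg.norm(y - P @ y) for P in projectors]
    return int(np.argmin(residuals))
```

Since the projector spans the same subspace as the class samples, minimizing the residual here is exactly the subspace-method decision rule used by CLAFIC, which is the equivalence the abstract refers to.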