Application of Linear Regression Classification to low-dimensional datasets


KOÇ M., BARKANA A.

NEUROCOMPUTING, vol.131, pp.331-335, 2014 (SCI-Expanded)

  • Publication Type: Article / Full Article
  • Volume: 131
  • Publication Date: 2014
  • DOI: 10.1016/j.neucom.2013.10.009
  • Journal Name: NEUROCOMPUTING
  • Journal Indexes: Science Citation Index Expanded (SCI-EXPANDED), Scopus
  • Page Numbers: pp.331-335
  • Keywords: Correlation matrix, Subspace methods, Linear Regression Classification, Class-featuring information compression, FACE RECOGNITION
  • Anadolu University Affiliated: Yes

Abstract

The traditional Linear Regression Classification (LRC) method fails when the number of training samples in a class exceeds their dimensionality. In this work, we propose a new implementation of LRC to overcome this problem in pattern recognition: the new form of LRC works even with an excessive number of low-dimensional samples. To explain the new form of LRC, the relation between the class predictor and the class correlation matrix is shown first. Then, for the derivation of LRC, the null space of the correlation matrix is generated from the eigenvectors corresponding to the smallest eigenvalues, and these eigenvectors are used to calculate the projection matrix of LRC. The theoretical equivalence of LRC and the Class-Featuring Information Compression (CLAFIC) method is also shown. The TI Digit database and the Multiple Feature dataset are used to illustrate the proposed improvement of LRC and CLAFIC. (C) 2013 Elsevier B.V. All rights reserved.
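The sketch below illustrates, in rough outline, the classification rule the abstract describes: for each class, eigenvectors of the class correlation matrix associated with the smallest eigenvalues are taken as an (approximate) null-space basis, and a test sample is assigned to the class whose null-space projection has the smallest norm, i.e. the smallest LRC residual. This is only an assumed reading of the abstract, not the authors' code; the function names, the parameter n_null (number of small-eigenvalue eigenvectors kept), and the normalization of the correlation matrix are illustrative choices and may differ from the paper.

```python
import numpy as np

def fit_null_bases(class_data, n_null):
    """For each class, keep the eigenvectors of the class correlation matrix
    that correspond to the smallest eigenvalues (an approximate null space).
    class_data: dict mapping a label to a (d, n_i) array whose columns are
    training samples of that class. n_null is a hypothetical parameter."""
    null_bases = {}
    for label, X in class_data.items():
        R = X @ X.T / X.shape[1]              # class correlation matrix (d x d)
        eigvals, eigvecs = np.linalg.eigh(R)  # eigenvalues in ascending order
        null_bases[label] = eigvecs[:, :n_null]  # smallest-eigenvalue eigenvectors
    return null_bases

def classify(y, null_bases):
    """Assign y to the class with the smallest norm of its projection onto
    that class's (approximate) null space, i.e. the smallest residual."""
    residuals = {label: np.linalg.norm(U.T @ y)
                 for label, U in null_bases.items()}
    return min(residuals, key=residuals.get)
```

As a usage sketch, `fit_null_bases({0: X0, 1: X1}, n_null=5)` followed by `classify(y, bases)` would label a test vector `y` by minimum residual over the two classes; the complementary choice of eigenvectors with the largest eigenvalues would correspond to the CLAFIC-style subspace projection mentioned in the abstract.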