A fast method for the implementation of common vector approach


Koç M., Barkana A., Gerek O. N.

INFORMATION SCIENCES, vol. 180, no. 20, pp. 4084-4098, 2010 (SCI-Expanded)

  • Publication Type: Article / Full Article
  • Volume: 180 Issue: 20
  • Publication Date: 2010
  • DOI: 10.1016/j.ins.2010.06.027
  • Journal Name: INFORMATION SCIENCES
  • Indexed in: Science Citation Index Expanded (SCI-EXPANDED), Scopus
  • Page Numbers: pp. 4084-4098
  • Keywords: Common vector approach, Fast classification algorithm, Classifier implementation, Face recognition
  • Anadolu University Affiliated: Yes

Abstract

In this paper, a novel computation method is proposed to perform the common vector approach (CVA) faster than its conventional implementation in pattern recognition. While conventional CVA calculations perform the classification with respect to distances between vectors, the new method performs the classification using scalars. A theoretical proof of the equivalence of the proposed method is provided. Next, in order to verify the numerical equivalence of the proposed computation method to the conventional (vector-based) method, numerical experiments are conducted over three face databases: the AR Database, the extended Yale Face Database B, and the FERET Database. Since the computational gain may depend on (i) the dimension of the feature vectors, (ii) the number of feature vectors used in training, and (iii) the number of classes, the effects of these factors are verified over these databases. Being theoretically equivalent, the proposed method yields exactly the same classification rates as the classical CVA implementation while improving classification speed. The new method is found to be about 2.1-3.0 times faster than the conventional CVA implementation for the AR face database, 1.9-3.3 times faster for the extended Yale Face Database B, and 1.9-3.1 times faster for the FERET Database.
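Since the abstract only summarizes the method, the sketch below illustrates the conventional (vector-based) CVA decision rule that the paper accelerates, not the paper's scalar-based formulation, whose details are given in the paper itself. It is a minimal NumPy sketch under the usual CVA assumption that the number of training samples per class is smaller than the feature dimension; the function and variable names (`train_cva`, `classify_cva`, etc.) are illustrative, not from the paper.

```python
# Minimal sketch of the conventional (vector-based) CVA classifier.
# Assumes each class has m training vectors of dimension d with m - 1 < d,
# so every class has a non-trivial indifference subspace.
import numpy as np

def train_cva(class_samples):
    """class_samples: list of (d, m) arrays, one array per class.
    Returns, per class, an orthonormal basis Q of the difference subspace
    and the common vector (residual of a training sample w.r.t. Q)."""
    models = []
    for A in class_samples:
        diffs = A[:, 1:] - A[:, [0]]           # difference vectors b_i = a_i - a_1
        Q, _ = np.linalg.qr(diffs)             # orthonormal basis of difference subspace
        a_com = A[:, 0] - Q @ (Q.T @ A[:, 0])  # common vector: component orthogonal to Q
        models.append((Q, a_com))
    return models

def classify_cva(x, models):
    """Assign x to the class whose common vector is nearest to the
    residual of x after removing that class's difference-subspace component."""
    dists = []
    for Q, a_com in models:
        residual = x - Q @ (Q.T @ x)           # project out the difference subspace
        dists.append(np.linalg.norm(residual - a_com))
    return int(np.argmin(dists))               # vector-distance decision rule

# Toy usage: 3 classes, 5 samples each, feature dimension 100
rng = np.random.default_rng(0)
classes = [rng.normal(loc=c, size=(100, 5)) for c in range(3)]
models = train_cva(classes)
test = classes[1][:, 0] + 0.01 * rng.normal(size=100)
print(classify_cva(test, models))              # expected: 1
```

Each classification call above performs one projection and one norm computation per class, so its cost grows with the feature dimension, the number of training vectors, and the number of classes, exactly the three factors the abstract lists; the paper's scalar-based reformulation is what yields the reported 1.9x to 3.3x speedups.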