Near optimal step size and momentum in gradient descent for quadratic functions



TAŞ E., MEMMEDLİ M.

TURKISH JOURNAL OF MATHEMATICS, vol. 41, no. 1, pp. 110-121, 2017 (SCI-Expanded)

  • Publication Type: Article / Full Article
  • Volume: 41 Issue: 1
  • Publication Date: 2017
  • DOI: 10.3906/mat-1411-51
  • Journal Name: TURKISH JOURNAL OF MATHEMATICS
  • Journal Indexes: Science Citation Index Expanded (SCI-EXPANDED), Scopus, TR DİZİN (ULAKBİM)
  • Page Numbers: pp. 110-121
  • Keywords: Gradient descent, step size, momentum, convergence speed, stability, STEEPEST DESCENT
  • Anadolu University Affiliated: Yes

Abstract

Many problems in statistical estimation, classification, and regression can be cast as optimization problems. Gradient descent, one of the simplest and easiest-to-implement multivariate optimization techniques, lies at the heart of many powerful classes of optimization methods. Its major disadvantage, however, is a slower rate of convergence than that of more sophisticated algorithms. To improve the convergence speed of gradient descent, we simultaneously determine a near-optimal scalar step size and momentum factor for gradient descent on a deterministic quadratic bowl from the largest and smallest eigenvalues of the Hessian. The resulting algorithm is demonstrated on specific and randomly generated test problems, and it converges faster than previous batch gradient descent methods.
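The approach described in the abstract can be sketched as follows. This is a minimal illustration, not the paper's own algorithm: the step size and momentum below use the classical heavy-ball formulas derived from the largest and smallest Hessian eigenvalues, which is the same eigenvalue-based idea the abstract describes; the paper's near-optimal values may differ in detail.

```python
import numpy as np

def heavy_ball(A, b, x0, iters=500):
    """Gradient descent with momentum on the quadratic bowl
    f(x) = 0.5 x^T A x - b^T x, for symmetric positive definite A.
    Step size and momentum are set from the extreme Hessian eigenvalues
    (classical heavy-ball choices, used here as an illustrative stand-in)."""
    lam = np.linalg.eigvalsh(A)          # eigenvalues of the Hessian A
    lmin, lmax = lam[0], lam[-1]
    s_min, s_max = np.sqrt(lmin), np.sqrt(lmax)
    alpha = 4.0 / (s_max + s_min) ** 2   # scalar step size
    beta = ((s_max - s_min) / (s_max + s_min)) ** 2  # momentum factor
    x_prev, x = x0.copy(), x0.copy()
    for _ in range(iters):
        grad = A @ x - b
        x_next = x - alpha * grad + beta * (x - x_prev)
        x_prev, x = x, x_next
    return x

# Example: an ill-conditioned 2-D quadratic, minimizer is A^{-1} b
A = np.diag([1.0, 100.0])
b = np.array([1.0, 1.0])
x_star = heavy_ball(A, b, np.zeros(2))
```

On this example the momentum term lets the iteration make progress along the shallow direction (eigenvalue 1) without diverging along the steep one (eigenvalue 100), which is why the eigenvalue-tuned pair outperforms plain gradient descent with a single conservative step size.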