Keywords: Global convergence

A New Spectral Conjugate Gradient Method for Solving Unconstrained Optimization Problems

Osama M. T. Wais Alkassaab; Khalil K. Abbo; Ibrahim A. Saleh

College Of Basic Education Research Journal, 2022, Volume 18, Issue 3, Pages 779-788
DOI: 10.33899/berj.2022.175697

Spectral conjugate gradient methods are attractive, and they have been shown to be effective for strictly convex quadratic minimization when applied properly. In this study, a new spectral conjugate gradient method is proposed for solving large-scale unconstrained optimization problems. Motivated by the benefits of the approximate optimal step-size strategy used in the gradient method, we derive a new rule for determining the spectral and conjugate parameters. Moreover, the new search direction satisfies both the spectral property and the sufficient descent condition. The global convergence of the proposed method is established under a set of suitable assumptions.
Consider the unconstrained optimization problem in n variables:
min f(x),  x ∈ R^n  (1)
where f: R^n → R is a continuously differentiable function. The conjugate gradient methods are among the most effective optimization strategies for solving problem (1). The conjugate gradient iteration has the form:
x_(k+1) = x_k + α_k d_k,  k = 0, 1, 2, …  (2)
where x_0 is the starting point, α_k is the step size, g_k = ∇f(x_k) is the gradient, and the search direction d_k is defined by:
d_k = { -g_k,                k = 0
      { -g_k + β_k d_(k-1),  k ≥ 1        (3)
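As an illustration of the iteration in (2)–(3), the following minimal Python sketch implements a generic nonlinear conjugate gradient loop. The paper's specific spectral and conjugate parameters are not given in this excerpt, so the sketch substitutes the classical Fletcher–Reeves choice of β_k and a simple Armijo backtracking line search for α_k; both are stand-in assumptions, not the authors' method.

```python
import numpy as np

def conjugate_gradient(f, grad, x0, tol=1e-6, max_iter=500):
    """Nonlinear CG iteration x_{k+1} = x_k + a_k d_k with directions (3).

    Uses the Fletcher-Reeves beta (an assumption for illustration; the
    paper proposes its own spectral/conjugate parameters) and an Armijo
    backtracking line search in place of the approximate optimal step size.
    """
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g                      # d_0 = -g_0, the first case of (3)
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        # Armijo backtracking line search for the step size a_k.
        a, c, rho = 1.0, 1e-4, 0.5
        fx = f(x)
        while f(x + a * d) > fx + c * a * g.dot(d):
            a *= rho
        x_new = x + a * d       # iteration (2)
        g_new = grad(x_new)
        beta = g_new.dot(g_new) / g.dot(g)  # Fletcher-Reeves beta_k
        d = -g_new + beta * d   # second case of (3), k >= 1
        if g_new.dot(d) >= 0:   # safeguard: restart if descent is lost
            d = -g_new
        x, g = x_new, g_new
    return x

# Example: a strictly convex quadratic f(x) = 0.5 x^T A x - b^T x,
# whose minimizer solves A x = b.
A = np.array([[3.0, 1.0], [1.0, 2.0]])
b = np.array([1.0, 1.0])
f = lambda x: 0.5 * x @ A @ x - b @ x
grad = lambda x: A @ x - b
x_star = conjugate_gradient(f, grad, np.zeros(2))
```

On this quadratic example the computed `x_star` agrees with the exact solution of A x = b, consistent with the effectiveness of conjugate gradient methods on strictly convex quadratics noted in the abstract.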