Mosul University, College of Basic Education Research Journal
ISSN 1992-7452, Vol. 18, No. 3, 1 September 2022, pp. 779-788
DOI: 10.33899/berj.2022.175697

A New Spectral Conjugate Gradient Method for Solving Unconstrained Optimization Problems

Osama M. T. Wais Alkassaab, Department of Mathematics, College of Basic Education, University of Mosul (ORCID: 0000-0002-7725-2260)
Khalil K. Abbo, Department of Mathematics, College of Basic Education, University of Telafer
Ibrahim A. Saleh, Department of Software Engineering, College of Computer Sciences and Mathematics, University of Mosul

Journal Article, received 16 March 2022

Spectral conjugate gradient methods are attractive, and it has been shown that, when designed properly, they are effective for strictly convex quadratic minimization. In this study, a new spectral conjugate gradient method is proposed for handling large-scale unconstrained optimization problems. Motivated by the benefits of the approximate optimal step-size strategy used in the gradient method, we devise a new way of determining the spectral and conjugate parameters. In addition, the new search direction satisfies both the spectral property and the sufficient descent condition. The global convergence of the proposed method is established under a set of suitable assumptions.

Consider the unconstrained optimization problem in n variables:

min f(x),  x ∈ R^n    (1)

where f : R^n → R is a continuously differentiable function. Conjugate gradient methods are among the most effective optimization strategies for solving problem (1).
The conjugate gradient method generates iterates of the form:

x_{k+1} = x_k + α_k d_k,  k = 0, 1, 2, ...    (2)

where x_0 is the starting point, α_k is the step size, g_k = ∇f(x_k), and the search direction d_k is defined by:

d_k = -g_k,                 k = 0
d_k = -g_k + β_k d_{k-1},   k ≥ 1    (3)

Full text: https://berj.mosuljournals.com/article_175697_75cc56bc7979a17e49ffed3cb7abc62a.pdf
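The iteration described by (2) and (3) can be sketched as follows. Note that this is only an illustrative implementation under stated assumptions: the choice of β_k below is the classical Fletcher-Reeves formula used purely as a placeholder, since the paper's own spectral and conjugate parameters are derived later and are not reproduced here, and the Armijo backtracking line search for α_k is likewise an assumption, not the step-size rule of the paper.

```python
import numpy as np

def conjugate_gradient(f, grad, x0, max_iter=1000, tol=1e-6):
    """Nonlinear conjugate gradient following iterations (2)-(3).

    beta_k uses the Fletcher-Reeves formula as a placeholder for the
    paper's spectral/conjugate parameters; alpha_k is found by a simple
    Armijo backtracking line search (also an assumption).
    """
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g                              # d_0 = -g_0, the k = 0 case of (3)
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        # Armijo backtracking line search for the step size alpha_k.
        alpha, c, rho = 1.0, 1e-4, 0.5
        while f(x + alpha * d) > f(x) + c * alpha * g.dot(d):
            alpha *= rho
        x = x + alpha * d               # x_{k+1} = x_k + alpha_k d_k, eq. (2)
        g_new = grad(x)
        beta = g_new.dot(g_new) / g.dot(g)   # Fletcher-Reeves beta_k (placeholder)
        d = -g_new + beta * d           # d_k = -g_k + beta_k d_{k-1}, eq. (3)
        if g_new.dot(d) >= 0:           # safeguard: restart with steepest descent
            d = -g_new                  # if d is not a descent direction
        g = g_new
    return x

# Example: a strictly convex quadratic f(x) = 0.5 x^T A x - b^T x,
# whose minimizer is the solution of A x = b.
A = np.array([[4.0, 1.0], [1.0, 3.0]])
b = np.array([1.0, 2.0])
f = lambda x: 0.5 * x.dot(A).dot(x) - b.dot(x)
grad = lambda x: A.dot(x) - b
x_star = conjugate_gradient(f, grad, np.zeros(2))
```

On this strictly convex quadratic the iterates converge to the unique minimizer A^{-1} b, which is the setting in which conjugate gradient methods are classically analyzed.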